AIACC-Inference can optimize TensorFlow-based image classification and object detection models to improve inference performance. This topic describes how to manually install and use AIACC-Inference in TensorFlow.


Prerequisites

  • An Alibaba Cloud GPU-accelerated instance is created.
    • Instance specifications: equipped with NVIDIA P100, V100, or T4 GPU
    • The image used by the instance: Ubuntu 18.04, Ubuntu 16.04, CentOS 8.x, or CentOS 7.x
  • The following software is installed on the GPU-accelerated instance:
    • Python 3.6
    • CUDA 10
    • cuDNN 7.4 or later
    • TensorFlow 1.14
    • TensorRT 5 or 6
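The prerequisite versions above can be checked with a short script. This is a minimal sketch: components that are missing are reported rather than raising, so it runs on any machine; the TensorRT Python bindings may not be present even when TensorRT itself is installed.

```python
# Quick sanity check for the prerequisites listed above. Missing components
# are reported rather than raising an error.
import sys

report = []
report.append("Python " + sys.version.split()[0])  # expect 3.6.x

try:
    import tensorflow as tf
    report.append("TensorFlow " + tf.__version__)  # expect 1.14
except ImportError:
    report.append("TensorFlow not found")

try:
    import tensorrt  # TensorRT Python bindings (may not be installed)
    report.append("TensorRT " + tensorrt.__version__)  # expect 5.x or 6.x
except ImportError:
    report.append("TensorRT Python bindings not found")

print("\n".join(report))
```

CUDA and cuDNN versions are not visible from Python alone; check them with `nvcc --version` and the cuDNN header on the instance if needed.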

Background information

To use AIACC-Inference in TensorFlow, you must call its interface in your TensorFlow code to optimize models. For more information, see Use AIACC-Inference in TensorFlow.
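The optimization interface itself is documented in the linked topic and is not reproduced here. As a purely illustrative sketch, the code below shows where such a call would typically slot into a standard TensorFlow 1.x frozen-graph workflow; the helper name `load_frozen_graph` is an assumption for illustration, and the AIACC call is left as a placeholder comment.

```python
# Illustrative sketch only: the AIACC-Inference optimization call is a
# placeholder comment; see "Use AIACC-Inference in TensorFlow" for the
# actual interface. Requires TensorFlow 1.14.
try:
    import tensorflow as tf

    def load_frozen_graph(pb_path):
        """Load a frozen GraphDef; an AIACC-Inference optimization pass
        would be applied to graph_def before importing it."""
        graph_def = tf.compat.v1.GraphDef()
        with tf.io.gfile.GFile(pb_path, "rb") as f:
            graph_def.ParseFromString(f.read())
        # AIACC-Inference optimization call goes here (see the linked topic)
        graph = tf.Graph()
        with graph.as_default():
            tf.import_graph_def(graph_def, name="")
        return graph

    status = "TensorFlow " + tf.__version__ + " available"
except ImportError:
    status = "TensorFlow is not installed in this environment"
print(status)
```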


Procedure

  1. Connect to the instance. For more information, see Connect to a Linux instance by using Workbench.
  2. Install AIACC-Inference in TensorFlow:
    pip3 install AIACC-Inference-TF-1.3.1-py3-none-any.whl
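After the wheel installs, you can confirm that pip registered it. The distribution name checked below is inferred from the wheel filename (`AIACC-Inference-TF`) and may differ on your system, so the script searches case-insensitively.

```python
# List installed distributions whose name contains "aiacc"; the expected
# name (inferred from the wheel filename) is AIACC-Inference-TF.
import pkg_resources

hits = [d.project_name for d in pkg_resources.working_set
        if "aiacc" in d.project_name.lower()]
print(hits if hits else "no AIACC distribution found")
```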