Apsara AI Accelerator (AIACC) is an artificial intelligence (AI) accelerator developed by Alibaba Cloud. It consists of a training accelerator (AIACC-Training) and an inference accelerator (AIACC-Inference). AIACC-Inference can accelerate major AI computing frameworks such as TensorFlow, as well as models exported in the Open Neural Network Exchange (ONNX) format, to achieve significant gains in inference performance. This topic describes how to automatically install AIACC-Inference and test the demo.

Background information

Conda is an open source package management system and environment management system that can run on different platforms. Miniconda is a minimal installer for conda. When you create a GPU-accelerated instance, you can have a conda environment that contains AIACC-Inference installed automatically. You can then use Miniconda to select the conda environment and use AIACC-Inference to improve inference performance.

ONNX is an open source format in which trained models are stored. You can convert models trained in different frameworks, such as PyTorch and MXNet, to the ONNX format. This makes it easy to test models from different frameworks in the same environment.

Automatically install AIACC-Inference

AIACC-Inference depends on the GPU driver, CUDA, and cuDNN. When you create a GPU-accelerated instance, select Auto-install GPU Driver and Auto-install AIACC-Inference. Then, select a CUDA version, a GPU driver, and a cuDNN version. After the instance is created, you can configure a conda environment that contains AIACC-Inference based on the CUDA version. For more information about how to create a GPU-accelerated instance, see Create an NVIDIA GPU-accelerated instance.

Test demo

  1. Connect to the instance. For more information, see Overview.
  2. Select a conda environment.
    1. Initialize Miniconda.
      . /root/miniconda/etc/profile.d/conda.sh
    2. View all conda environments.
      conda env list
      The following figure shows an example command output.
    3. Select a conda environment.
      conda activate [environment_name]
      The following figure shows an example command output.
  3. Test the demo.
    By default, the aiacc_inference_demo.tgz demo file is located in the /root directory. In this example, the ONNX demo is tested.
    1. Decompress the demo test package.
      tar -xvf aiacc_inference_demo.tgz
    2. Go to the ONNX demo directory.
      cd /root/aiacc_inference_demo/aiacc_inference_onnx/resnet50v1
    3. Run the test script in the directory.
      Sample command:
      python3 test.py
      The test script uses a ResNet50 model to execute inference tasks that randomly generate and classify images. AIACC-Inference reduces the inference time per task from 6.4 ms to less than 1.5 ms. The following figure shows an example inference result.
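    As a rough illustration of how a per-task latency figure like the one above might be measured, here is a minimal timing sketch. The `run_inference` function is a hypothetical stand-in for the actual model call in test.py (for example, an ONNX Runtime session), replaced with a trivial computation so the sketch is self-contained:

```python
import time
import statistics

def run_inference(batch):
    # Placeholder for the real model call; a trivial computation
    # keeps this sketch runnable without a GPU or a model file.
    return [x * 2 for x in batch]

def measure_latency_ms(fn, batch, warmup=5, iters=50):
    # Warm-up runs let caching and lazy-initialization effects settle
    # before timing, so they do not distort the measurement.
    for _ in range(warmup):
        fn(batch)
    timings = []
    for _ in range(iters):
        start = time.perf_counter()
        fn(batch)
        timings.append((time.perf_counter() - start) * 1000.0)
    # The median is less sensitive to outlier runs than the mean.
    return statistics.median(timings)

latency = measure_latency_ms(run_inference, list(range(224)))
print(f"median latency: {latency:.3f} ms")
```

    Reporting the median over many iterations, after a warm-up phase, is a common way to obtain stable per-task latency numbers for inference benchmarks.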

Delete Miniconda

You can delete Miniconda if you no longer need AIACC-Inference. By default, the root user can install and delete Miniconda.

  1. Delete the miniconda folder.
    rm -rf /root/miniconda
  2. Delete relevant environment variables and output.
    1. Modify the /root/.bashrc file and comment out the environment variables and output related to Miniconda and AIACC-Inference.
    2. Make the changes to the environment variables take effect.
      source /root/.bashrc
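The edit in step 2 can also be scripted. The sketch below assumes Miniconda wrote its standard "# >>> conda initialize >>>" / "# <<< conda initialize <<<" markers into .bashrc and comments out the lines between them; it is demonstrated on a sample string rather than on the real /root/.bashrc:

```python
def comment_out_conda_block(text: str) -> str:
    # Comment out every non-comment line between the conda init markers,
    # leaving the markers themselves (already comments) untouched.
    out, inside = [], False
    for line in text.splitlines():
        if line.strip() == "# >>> conda initialize >>>":
            inside = True
        if inside and not line.lstrip().startswith("#"):
            line = "# " + line
        out.append(line)
        if line.strip() == "# <<< conda initialize <<<":
            inside = False
    return "\n".join(out) + "\n"

# Demonstration on a sample fragment (not the real /root/.bashrc):
sample = (
    "# >>> conda initialize >>>\n"
    "export PATH=/root/miniconda/bin:$PATH\n"
    "# <<< conda initialize <<<\n"
)
print(comment_out_conda_block(sample))
```

After applying such a change to the real file, running `source /root/.bashrc` makes it take effect, as in step 2 above.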