Apsara AI Accelerator (AIACC) is an artificial intelligence (AI) accelerator developed
by Alibaba Cloud. It consists of a training accelerator (AIACC-Training) and an inference
accelerator (AIACC-Inference). AIACC-Inference can accelerate models built with the major
AI computing framework TensorFlow, as well as models exported in the Open Neural Network
Exchange (ONNX) format, to achieve significant gains in inference performance. This topic
describes how to automatically install AIACC-Inference and test the demo.
Background information
Conda is an open source package management and environment management system that
runs on multiple platforms. Miniconda is a minimal installer for conda. When you create
a GPU-accelerated instance, you can have a conda environment that contains
AIACC-Inference installed automatically. You can then use Miniconda to select a conda
environment and use AIACC-Inference to improve inference performance.
ONNX is an open source format for storing trained models. You can convert models from
different frameworks, such as PyTorch and MXNet, to the ONNX format. This makes it
easy to test models from different frameworks in the same environment.
Automatically install AIACC-Inference
AIACC-Inference depends on the GPU driver, CUDA, and cuDNN. When you create a GPU-accelerated
instance, select
Auto-install GPU Driver and
Auto-install AIACC-Inference. Then, select a CUDA version, a GPU driver version, and a cuDNN version. After the instance
is created, you can select a conda environment that contains AIACC-Inference based
on the CUDA version. For more information about how to create a GPU-accelerated instance,
see
Create an NVIDIA GPU-accelerated instance.

Test demo
- Connect to the instance. For more information, see Overview.
- Select a conda environment.
- Initialize Miniconda.
. /root/miniconda/etc/profile.d/conda.sh
- View all conda environments.
conda env list
The following figure shows an example command output.

- Select a conda environment.
conda activate [environment_name]
Replace [environment_name] with the name of one of the environments listed in the previous step.
The following figure shows an example command output.

- Test the demo.
By default, the aiacc_inference_demo.tgz demo file is located in the /root directory. In this example, the ONNX demo is tested.
- Decompress the demo test package.
tar -xvf aiacc_inference_demo.tgz
- Go to the ONNX demo directory.
cd /root/aiacc_inference_demo/aiacc_inference_onnx/resnet50v1
- Run the test script in the directory.
Sample command:
python3 test.py
The test script uses a ResNet50 model to classify randomly generated images.
AIACC-Inference reduces the inference time per task from 6.4 ms to less than 1.5 ms.
The following figure shows an example inference result.

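For reference, the kind of per-task latency figure that test.py reports can be measured with a simple harness like the one below. This is a hypothetical sketch using only the standard library; run_inference stands in for the demo's actual ResNet50 call.

```python
import statistics
import time

def run_inference(batch):
    # Placeholder for a real model forward pass (e.g. an ONNX runtime session)
    return [x * 2 for x in batch]

def benchmark(fn, batch, warmup=5, iters=50):
    """Return the median latency of fn(batch) in milliseconds."""
    for _ in range(warmup):          # warm up caches / lazy initialization
        fn(batch)
    times = []
    for _ in range(iters):
        start = time.perf_counter()
        fn(batch)
        times.append((time.perf_counter() - start) * 1000.0)  # seconds -> ms
    return statistics.median(times)

latency_ms = benchmark(run_inference, list(range(1024)))
print(f"median latency: {latency_ms:.3f} ms")
```

Warming up before timing matters here: the first few inference calls are typically dominated by one-time setup cost rather than steady-state latency.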
Delete Miniconda
You can delete Miniconda if you no longer need AIACC-Inference. By default, the root
user can install and delete Miniconda.
- Delete the miniconda folder.
- Delete relevant environment variables and output.
- Modify the /root/.bashrc file and comment out the environment variables and output related to Miniconda and
AIACC-Inference, as shown in the following figure.
- Make the changes to the environment variables take effect.
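Taken together, the removal steps amount to something like the following sketch. MINICONDA_HOME and BASHRC default to the paths used in this topic; commenting out the Miniconda and AIACC-Inference lines in the .bashrc file itself is still a manual edit.

```shell
# Defaults match this guide's root-user setup; override if your paths differ
MINICONDA_HOME=${MINICONDA_HOME:-/root/miniconda}
BASHRC=${BASHRC:-/root/.bashrc}

# Step 1: delete the miniconda folder
[ -d "$MINICONDA_HOME" ] && rm -rf "$MINICONDA_HOME"

# Step 2: after commenting out the Miniconda/AIACC-Inference lines in $BASHRC
# by hand, reload it so the change takes effect in the current shell
[ -f "$BASHRC" ] && . "$BASHRC"

echo "Removed $MINICONDA_HOME"
```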