Function Compute (2.0): Image usage notes

Last Updated: Feb 22, 2024

Container images are commonly used in AI scenarios, and we recommend that you use them as the deliverables for AI workloads on Function Compute. Because images in AI and big data scenarios are typically gigabytes in size, Function Compute raises the image size limits and supports image acceleration. This topic describes the image size limits, image acceleration, and how to use the public base images.

Size limits on compressed images

The following table describes the size limits on compressed images. The limits vary based on the types and editions of Container Registry instances. For more information about the billing of Container Registry, see Billing rules.

| Container Registry Edition | Limit on image size (GB) | Payable |
| --- | --- | --- |
| Container Registry Enterprise Edition (Standard Edition) | 10 | Yes |
| Container Registry Enterprise Edition (Advanced Edition) | 10 | Yes |
| Container Registry Enterprise Edition (Basic Edition) | 10 | Yes |
| Container Registry Personal Edition | 10 | No |

Common image acceleration

By default, Function Compute enables the image acceleration feature for instances of Container Registry Personal Edition and Container Registry Enterprise Edition. This feature reduces the time required to pull an image from minutes to seconds and shortens cold starts for functions that use large images.

Use a driver-independent container image

Do not add driver-related components to your image. In addition, make sure that your application does not depend on specific driver versions. For example, do not include the libcuda.so library that provides the CUDA Driver API in your image. This dynamic library is strongly related to the kernel driver version of a device. If the dynamic library does not match the driver version, exceptions may occur.
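
If you are not sure whether driver-related components ended up in your image, you can inspect the image before you deploy it. The following command is a minimal sketch, assuming your image is available locally and contains a shell and the find utility; my-image:tag is a placeholder name:

# Look for libcuda.so copies baked into the image layers; the command should print nothing.
docker run --rm --entrypoint /bin/sh my-image:tag -c "find / -name 'libcuda.so*' 2>/dev/null"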

When creating a function instance, Function Compute injects the driver-related user-mode components into the container in advance. These components match the driver version provided by Function Compute. As in GPU container virtualization technologies such as NVIDIA Container Runtime, driver-specific tasks are handled by the platform resource provider, which makes GPU container images more portable across environments. NVIDIA provides the drivers that are used by GPU-accelerated instances of Function Compute. The driver versions used by the instances may change as a result of feature iteration, new card releases, bug fixes, and driver lifecycle expiration.

If you already use GPU container virtualization technologies such as NVIDIA Container Runtime, do not use the docker commit command to create images. Images created this way contain the injected driver-related components. When you use such images in Function Compute, application exceptions may occur because of undefined behavior caused by mismatched component versions.
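
Instead of committing a running GPU container, describe the image declaratively in a Dockerfile and rebuild it, so that no injected driver components are captured in the image layers. The following commands are a minimal sketch; my-app:v1 is a placeholder image name:

# Build the image from a Dockerfile rather than committing a running container.
docker build -t my-app:v1 .
# Before pushing, you can re-run the libcuda.so check shown above against my-app:v1.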

Alibaba Cloud base images for Function Compute

Alibaba Cloud provides various base images for GPU acceleration scenarios of Function Compute. These base images ensure optimal compatibility and performance and allow you to build business logic with ease. We recommend that you preferentially use a base image to build your business logic.

These base images include mainstream machine learning frameworks and images for popular model platforms, such as PyTorch, TensorFlow, and PaddlePaddle. They are preloaded with the environments and dependencies required to run on Function Compute, so you can get started with high-performance computing in GPU acceleration scenarios without installing or configuring the environments and dependencies yourself. The base images also help improve the performance and reliability of your applications. The base images provided by Alibaba Cloud for GPU acceleration scenarios of Function Compute are listed below.

For each image family, the registry-vpc prefix can be used for VPC-based image pulling.

modelscope

  • Address: registry.{cn-hangzhou|us-east-1|ap-northeast-1}.aliyuncs.com/serverless_devs/modelscope
  • Image tag: ubuntu20.04-cuda11.3.0-py37-torch1.11.0-tf1.15.5-1.5.0
  • Framework version: N/A
  • Python version: 3.7
  • CUDA version: 11.3.0
  • Ubuntu version: 20.04

PyTorch

  • Address: registry.{cn-hangzhou|us-east-1|ap-northeast-1}.aliyuncs.com/serverless_devs/pytorch
  • Image tag: 22.12-py3
  • Framework version: 1.14.0
  • Python version: 3.8
  • CUDA version: 11.8.0
  • Ubuntu version: 20.04

TensorFlow

  • Address: registry.{cn-hangzhou|us-east-1|ap-northeast-1}.aliyuncs.com/serverless_devs/tensorflow
  • Image tags: 22.12-tf1-py3 (TensorFlow 1.15.5) and 22.12-tf2-py3 (TensorFlow 2.10.1)
  • Python version: 3.8
  • CUDA version: 11.8.0
  • Ubuntu version: 20.04

PaddlePaddle

  • Address: registry.{cn-hangzhou|us-east-1|ap-northeast-1}.aliyuncs.com/serverless_devs/paddlepaddle
  • Image tag: 22.12-py3
  • Framework version: 2.3.2
  • Python version: 3.8
  • CUDA version: 11.8.0
  • Ubuntu version: 22.04

CUDA

  • Address: registry.{cn-hangzhou|us-east-1|ap-northeast-1}.aliyuncs.com/serverless_devs/cuda
  • Image tag: 11.8.0-devel-ubuntu22.04
  • Framework version: N/A
  • Python version: N/A
  • CUDA version: 11.8.0
  • Ubuntu version: 22.04

Important
  • Base images are available only in regions where GPU acceleration is available. The regions include China (Hangzhou), China (Shanghai), China (Beijing), China (Zhangjiakou), China (Shenzhen), Japan (Tokyo), and US (Virginia). For information about the IDs of the regions in which Container Registry is supported, see Regions.

  • Base images are available only for users of Container Registry Personal Edition. This feature cannot improve performance for users of Container Registry Enterprise Edition because of data isolation issues.

Benefits of base images

Base images of Function Compute provide the following benefits in serverless GPU scenarios:

  • Better compatibility

    Base images for serverless GPU scenarios in Function Compute are optimized and tested for GPU-accelerated instances to ensure better application compatibility and higher application stability.

  • Improved performance

    The frameworks and data loading in the base images for GPU-accelerated instances of Function Compute are optimized to provide better end-to-end performance and user experience. In addition, the base images contain common computing libraries, such as NumPy and TensorFlow, which help you write high-performance code with ease.

  • Simplified build process

    You can directly use base images to build your business logic in Function Compute. This eliminates the need to manually install and configure common dependencies such as NumPy and SciPy.

Base images help you better build business logic and achieve higher performance and better compatibility. We recommend that you use the base images provided by Function Compute in GPU scenarios when you build your business logic.

How to use a base image

Base images are easy to use: you only need to reference the base image in the Dockerfile that builds your business logic. For example, to use the PyTorch base image in a GPU function in the China (Shanghai) region, add the following code to your Dockerfile:

# Start from the PyTorch base image hosted in the China (Shanghai) region.
FROM registry.cn-shanghai.aliyuncs.com/serverless_devs/pytorch:22.12-py3

# Copy your application code into the image.
ADD . .
# Custom container functions in Function Compute listen on port 9000 by default.
EXPOSE 9000

Function Compute provides public registry addresses from which you can conveniently pull these images. For example, you can run the following command to pull the PyTorch base image:

docker pull registry.cn-shanghai.aliyuncs.com/serverless_devs/pytorch:22.12-py3
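
After you build an image on top of a base image, push it to your own Container Registry repository so that Function Compute can pull it when you create the function. The following commands are a minimal sketch; the namespace, repository name, and tag are placeholders, and you must log on with your own Container Registry credentials:

# Build the image from the Dockerfile shown above.
docker build -t my-gpu-app:v1 .
# Log on to your Container Registry instance in the target region.
docker login registry.cn-shanghai.aliyuncs.com
# Tag the image with your own repository address and push it.
docker tag my-gpu-app:v1 registry.cn-shanghai.aliyuncs.com/<your-namespace>/my-gpu-app:v1
docker push registry.cn-shanghai.aliyuncs.com/<your-namespace>/my-gpu-app:v1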

FAQ

Is there any difference between the base images of Function Compute and the images provided by NVIDIA?

No, the images are the same. You do not need to worry about compatibility issues.

If the version of the framework that I want to use is later than that of the base image provided by Function Compute, can high performance still be ensured?

Yes. Container images are layered, so a large part of the data is shared between an earlier and a later version of a framework, and high performance can still be ensured.
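
For example, if you need a framework version that is newer than the one preinstalled in a base image, you can install it on top of the base image so that the optimized lower layers are reused. The following Dockerfile is a minimal sketch; the PyTorch version shown is only an illustration:

# Reuse the optimized base image layers and install a newer framework version on top.
FROM registry.cn-shanghai.aliyuncs.com/serverless_devs/pytorch:22.12-py3
RUN pip install --no-cache-dir torch==2.0.1

ADD . .
EXPOSE 9000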

What do I do if I cannot find a required base image?

We recommend that you join the DingTalk group 11721331 for technical support.

Are base images compatible with different types of GPUs?

Yes. Base images are compatible with all GPU types available to GPU-accelerated instances in Function Compute, and they work as expected in both the on-demand mode and the provisioned mode.