
Learning about AIACC-AGSpeed | AGSpeed Performance Data

This article describes the performance data of AIACC-AGSpeed (AGSpeed) in training models. Compared with native PyTorch Eager training, AGSpeed delivers significantly improved performance.

Background Information

The example in this article tests the performance of AGSpeed when applied to different model training scenarios.

Performance Data

In this example, more than 50 models, including hf_GPT2, hf_Bert, resnet50, and timm_efficientnet, are trained in FP32 and with automatic mixed precision (AMP); a minimal AMP training step is sketched after the figure and axis descriptions below. The following figures show the performance data of each model in the two scenarios.

  • FP32 training scenarios (Figure 1)

  • AMP training scenarios (Figure 2)

The x-axis and y-axis in the figures are described as follows:

  • x-axis: shows all models that are trained.
  • y-axis: indicates the performance improvement ratio of AGSpeed over PyTorch Eager. A value greater than 1.0 indicates improved performance.
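For context, the AMP scenario corresponds to a standard PyTorch mixed-precision training step. The following is a minimal sketch of such a step using torch.cuda.amp; the model, data, and hyperparameters are placeholders, and any AGSpeed-specific integration is intentionally omitted because it is not covered in this article.

import torch
from torch import nn
from torch.cuda.amp import autocast, GradScaler

# Placeholder model and data; the actual tests use models such as hf_GPT2 or resnet50.
model = nn.Linear(1024, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scaler = GradScaler()

inputs = torch.randn(32, 1024, device="cuda")
targets = torch.randint(0, 10, (32,), device="cuda")

for _ in range(10):
    optimizer.zero_grad()
    with autocast():                      # run the forward pass in mixed precision
        loss = nn.functional.cross_entropy(model(inputs), targets)
    scaler.scale(loss).backward()         # scale the loss to avoid FP16 gradient underflow
    scaler.step(optimizer)                # unscale gradients and step the optimizer
    scaler.update()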

Test Results

The following table describes the performance improvement ratio of AGSpeed compared with native PyTorch Eager. The example in this topic uses throughput as the sole performance metric. The improvement is calculated with the following formula: Performance improvement ratio = (throughput(AGSpeed) - throughput(Eager)) / throughput(Eager).
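As a quick check of the formula, the following sketch recomputes the improvement ratio from two throughput measurements (samples per second); the example numbers are hypothetical and are not taken from the published results.

def improvement_ratio(throughput_agspeed: float, throughput_eager: float) -> float:
    # (throughput(AGSpeed) - throughput(Eager)) / throughput(Eager)
    return (throughput_agspeed - throughput_eager) / throughput_eager

# Hypothetical values: 620 samples/s with AGSpeed vs. 500 samples/s with Eager.
print(f"{improvement_ratio(620.0, 500.0):.1%}")  # prints 24.0%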

Note
The following section shows the performance data of some test models.

(Table: performance improvement ratios of AGSpeed over PyTorch Eager for the tested models.)
