New Features

MaxCompute - MaxFrame AI Function released

Apr 09 2025

MaxCompute
MaxCompute introduces the MaxFrame AI Function feature, which lets you call large language models (LLMs) to process large amounts of data offline.
Content

Target customers: data development engineers and data scientists.

New features and specifications: MaxFrame AI Function integrates LLMs such as Qwen2.5 and DeepSeek-R1-Distill-Qwen. You can call these LLMs through simple programming interfaces to process large amounts of data in MaxCompute tables offline. The LLMs are hosted in MaxCompute for offline inference, so you do not need to handle model download and distribution or worry about the maximum concurrency of API calls. As a result, MaxFrame jobs that call AI Function can make full use of the massive computing resources of MaxCompute and complete LLM-based text processing tasks with high token throughput and concurrency. A usage sketch follows below.
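
A minimal sketch, assuming the standard MaxFrame Python workflow (an ODPS entry object, new_session, read_odps_table, and to_odps_table). The ai_generate helper is a hypothetical stand-in for the AI Function call, and the model name, project, endpoint, table, and column names are placeholders; consult the MaxFrame AI Function documentation for the exact interface.

# Minimal sketch: run an LLM over a MaxCompute table with MaxFrame.
# Session and DataFrame calls follow the public MaxFrame Python API;
# ai_generate is a hypothetical stand-in for the AI Function interface.
import os

from odps import ODPS
import maxframe.dataframe as md
from maxframe import new_session


def ai_generate(column, model, prompt):
    # Hypothetical placeholder for the AI Function call: it only tags each
    # row instead of invoking the hosted LLM (prompt is unused by this mock).
    # Replace with the real AI Function interface from the MaxFrame docs.
    return column.map(lambda text: f"[{model} summary placeholder] {text}")


# Connect to a MaxCompute project (credentials, project, and endpoint are placeholders).
o = ODPS(
    os.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID"),
    os.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET"),
    project="your_project",
    endpoint="https://service.cn-hangzhou.maxcompute.aliyun.com/api",
)
session = new_session(o)

# Read the source table as a lazy MaxFrame DataFrame; the work runs
# inside MaxCompute when execute() is called.
df = md.read_odps_table("customer_reviews", index_col="review_id")

# Offline LLM step (e.g. Qwen2.5) over the review_text column.
df["summary"] = ai_generate(
    df["review_text"],
    model="Qwen2.5",
    prompt="Summarize the following customer review: {text}",
)

# Write the results back to a MaxCompute table and trigger execution.
md.to_odps_table(df, "customer_review_summaries").execute()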
