
Alibaba Cloud Model Studio: MiniMax

Last Updated: Mar 11, 2026

Call MiniMax models on Alibaba Cloud Model Studio.

Important

This document applies only to the Chinese mainland region. To use the model, get an API key from the Chinese mainland region.

Model overview

MiniMax-M2.5 is the latest model in the MiniMax series. It excels at coding, office tasks, and text summarization, and delivers fast output.

| Model | Context window (tokens) | Max input (tokens) | Max CoT + response (tokens) |
| --- | --- | --- | --- |
| MiniMax-M2.5 | 196,608 | 196,601 | 32,768 |

The thinking_budget parameter is not supported. Only thinking mode is supported.
These models are not third-party services. They are deployed on Model Studio servers.

Getting started

Prerequisites: Before you start, create an API key and set it as an environment variable. If you call the model through an SDK, install the OpenAI SDK or the DashScope SDK.
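
On Linux or macOS, the setup might look like the following sketch (the package names assume the official openai and dashscope Python SDKs; replace the placeholder key with your own):

```shell
# Make the API key available to the sample code (placeholder value; use your own key)
export DASHSCOPE_API_KEY="sk-xxx"

# Install the SDK you plan to call the model with
pip install -U openai      # for the OpenAI-compatible samples
pip install -U dashscope   # for the DashScope samples
```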

OpenAI compatible

Python

Sample code

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

completion = client.chat.completions.create(
    model="MiniMax-M2.5",
    messages=[{"role": "user", "content": "Who are you?"}],
    stream=True,
)

reasoning_content = ""  # Full chain-of-thought
answer_content = ""     # Full response
is_answering = False    # Whether response has started

print("\n" + "=" * 20 + "Chain of thought" + "=" * 20 + "\n")

for chunk in completion:
    if chunk.choices:
        delta = chunk.choices[0].delta
        # Collect only chain-of-thought content
        if hasattr(delta, "reasoning_content") and delta.reasoning_content is not None:
            if not is_answering:
                print(delta.reasoning_content, end="", flush=True)
            reasoning_content += delta.reasoning_content
        # Start response when content arrives
        if hasattr(delta, "content") and delta.content:
            if not is_answering:
                print("\n" + "=" * 20 + "Full response" + "=" * 20 + "\n")
                is_answering = True
            print(delta.content, end="", flush=True)
            answer_content += delta.content

Response

====================Chain of thought====================

The user asked "Who are you?".

I should reply and introduce myself as an AI assistant.
====================Full response====================

Hello! I am MiniMax-M2.5, an AI assistant. I can help you answer questions, provide information, hold conversations, and more. How can I help you?

Node.js

Sample code

import OpenAI from "openai";
import process from 'process';

// Initialize the OpenAI client
const openai = new OpenAI({
    // If you have not set the environment variable, replace this with your Alibaba Cloud Model Studio API key: apiKey: "sk-xxx"
    apiKey: process.env.DASHSCOPE_API_KEY,
    baseURL: 'https://dashscope.aliyuncs.com/compatible-mode/v1'
});

let reasoningContent = ''; // Full chain-of-thought
let answerContent = ''; // Full response
let isAnswering = false; // Whether response has started

async function main() {
    const messages = [{ role: 'user', content: 'Who are you?' }];

    const stream = await openai.chat.completions.create({
        model: 'MiniMax-M2.5',
        messages,
        stream: true,
    });

    console.log('\n' + '='.repeat(20) + 'Chain of thought' + '='.repeat(20) + '\n');

    for await (const chunk of stream) {
        if (chunk.choices?.length) {
            const delta = chunk.choices[0].delta;
            // Collect only chain-of-thought content
            if (delta.reasoning_content !== undefined && delta.reasoning_content !== null) {
                if (!isAnswering) {
                    process.stdout.write(delta.reasoning_content);
                }
                reasoningContent += delta.reasoning_content;
            }

            // Start response when content arrives
            if (delta.content) {
                if (!isAnswering) {
                    console.log('\n' + '='.repeat(20) + 'Full response' + '='.repeat(20) + '\n');
                    isAnswering = true;
                }
                process.stdout.write(delta.content);
                answerContent += delta.content;
            }
        }
    }
}

main();

Response

====================Chain of thought====================

The user asked "Who are you?".

I should reply and introduce myself as an AI assistant.
====================Full response====================

Hello! I am MiniMax-M2.5, an AI assistant. I can help you answer questions, provide information, hold conversations, and more. How can I help you?

HTTP

Sample code

curl

curl -X POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions \
-H "Authorization: Bearer $DASHSCOPE_API_KEY" \
-H "Content-Type: application/json" \
-d '{
    "model": "MiniMax-M2.5",
    "messages": [
        {
            "role": "user", 
            "content": "Who are you?"
        }
    ]
}'

Response

{
    "choices": [
        {
            "message": {
                "content": "Hello! I am MiniMax-M2.5, an AI assistant developed by MiniMax. I can help you answer questions, provide information, hold conversations, and complete various text-related tasks. How can I help you?",
                "reasoning_content": "The user asked \"Who are you?\".\n\nI should reply and introduce myself.",
                "role": "assistant"
            },
            "finish_reason": "stop",
            "index": 0,
            "logprobs": null
        }
    ],
    "object": "chat.completion",
    "usage": {
        "prompt_tokens": 40,
        "completion_tokens": 72,
        "total_tokens": 112,
        "completion_tokens_details": {
            "reasoning_tokens": 26
        },
        "prompt_tokens_details": {
            "cached_tokens": 0
        }
    },
    "created": 1771944590,
    "system_fingerprint": null,
    "model": "MiniMax-M2.5",
    "id": "chatcmpl-b1277a9c-52da-9de7-988a-d5c063d83xxx"
}
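
The response envelope above follows the OpenAI chat-completions schema, with reasoning_content added as an extra field on the message. A minimal sketch of extracting the answer, the chain of thought, and the token usage from such a JSON payload (using a trimmed copy of the response shown above):

```python
import json

# Trimmed copy of the OpenAI-compatible response shown above
raw = """{
    "choices": [{"message": {"content": "Hello! I am MiniMax-M2.5.",
                             "reasoning_content": "The user asked who I am.",
                             "role": "assistant"},
                 "finish_reason": "stop", "index": 0}],
    "usage": {"prompt_tokens": 40, "completion_tokens": 72, "total_tokens": 112,
              "completion_tokens_details": {"reasoning_tokens": 26}}
}"""

data = json.loads(raw)
message = data["choices"][0]["message"]
answer = message["content"]
# reasoning_content is a Model Studio extension to the schema, so read it defensively
reasoning = message.get("reasoning_content", "")
reasoning_tokens = data["usage"]["completion_tokens_details"]["reasoning_tokens"]
```

The same defensive pattern (dict.get, or getattr when using the SDK's response objects) applies whenever a field is an extension rather than part of the standard OpenAI schema.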

DashScope

Python

Sample code

import os
from dashscope import Generation

# Initialize request parameters
messages = [{"role": "user", "content": "Who are you?"}]

completion = Generation.call(
    # If you have not set the environment variable, replace this with your Alibaba Cloud Model Studio API key: api_key="sk-xxx"
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    model="MiniMax-M2.5",
    messages=messages,
    result_format="message",  # Set result format to message
    stream=True,              # Enable streaming output
    incremental_output=True,  # Enable incremental output
)

reasoning_content = ""  # Full chain-of-thought
answer_content = ""     # Full response
is_answering = False    # Whether response has started

print("\n" + "=" * 20 + "Chain of thought" + "=" * 20 + "\n")

for chunk in completion:
    message = chunk.output.choices[0].message
    
    # Collect only chain-of-thought content
    if message.reasoning_content:
        if not is_answering:
            print(message.reasoning_content, end="", flush=True)
        reasoning_content += message.reasoning_content

    # Start response when content arrives
    if message.content:
        if not is_answering:
            print("\n" + "=" * 20 + "Full response" + "=" * 20 + "\n")
            is_answering = True
        print(message.content, end="", flush=True)
        answer_content += message.content

# After the loop ends, reasoning_content and answer_content contain the full content
# You can process them further as needed
# print(f"\n\nFull chain of thought:\n{reasoning_content}")
# print(f"\nFull response:\n{answer_content}")

Response

====================Chain of thought====================

The user asked "Who are you?".

I should reply and introduce myself as an AI assistant.
====================Full response====================

Hello! I am MiniMax-M2.5, an AI assistant. I can help you answer questions, provide information, hold conversations, and more. How can I help you?

Java

Sample code

// DashScope SDK version >= 2.19.4
import com.alibaba.dashscope.aigc.generation.Generation;
import com.alibaba.dashscope.aigc.generation.GenerationParam;
import com.alibaba.dashscope.aigc.generation.GenerationResult;
import com.alibaba.dashscope.common.Message;
import com.alibaba.dashscope.common.Role;
import com.alibaba.dashscope.exception.ApiException;
import com.alibaba.dashscope.exception.InputRequiredException;
import com.alibaba.dashscope.exception.NoApiKeyException;
import io.reactivex.Flowable;
import java.lang.System;
import java.util.Arrays;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Main {
    private static final Logger logger = LoggerFactory.getLogger(Main.class);
    private static StringBuilder reasoningContent = new StringBuilder();
    private static StringBuilder finalContent = new StringBuilder();
    private static boolean isFirstPrint = true;

    private static void handleGenerationResult(GenerationResult message) {
        String reasoning = message.getOutput().getChoices().get(0).getMessage().getReasoningContent();
        String content = message.getOutput().getChoices().get(0).getMessage().getContent();

        if (reasoning != null && !reasoning.isEmpty()) {
            reasoningContent.append(reasoning);
            if (isFirstPrint) {
                System.out.println("====================Chain of thought====================");
                isFirstPrint = false;
            }
            System.out.print(reasoning);
        }

        if (content != null && !content.isEmpty()) {
            finalContent.append(content);
            if (!isFirstPrint) {
                System.out.println("\n====================Full response====================");
                isFirstPrint = true;
            }
            System.out.print(content);
        }
    }
    private static GenerationParam buildGenerationParam(Message userMsg) {
        return GenerationParam.builder()
                // If you have not set the environment variable, replace this line with: .apiKey("sk-xxx")
                .apiKey(System.getenv("DASHSCOPE_API_KEY"))
                .model("MiniMax-M2.5")
                .incrementalOutput(true)
                .resultFormat("message")
                .messages(Arrays.asList(userMsg))
                .build();
    }
    public static void streamCallWithMessage(Generation gen, Message userMsg)
            throws NoApiKeyException, ApiException, InputRequiredException {
        GenerationParam param = buildGenerationParam(userMsg);
        Flowable<GenerationResult> result = gen.streamCall(param);
        result.blockingForEach(message -> handleGenerationResult(message));
    }

    public static void main(String[] args) {
        try {
            Generation gen = new Generation();
            Message userMsg = Message.builder().role(Role.USER.getValue()).content("Who are you?").build();
            streamCallWithMessage(gen, userMsg);
            // Print final results
            // if (reasoningContent.length() > 0) {
            //     System.out.println("\n====================Full response====================");
            //     System.out.println(finalContent.toString());
            // }
        } catch (ApiException | NoApiKeyException | InputRequiredException e) {
            logger.error("An exception occurred: {}", e.getMessage());
        }
        System.exit(0);
    }
}

Response

====================Chain of thought====================

The user asked "Who are you?".

I should reply and introduce myself as an AI assistant.
====================Full response====================

Hello! I am MiniMax-M2.5, an AI assistant. I can help you answer questions, provide information, hold conversations, and more. How can I help you?

HTTP

Sample code

curl

curl -X POST "https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation" \
-H "Authorization: Bearer $DASHSCOPE_API_KEY" \
-H "Content-Type: application/json" \
-d '{
    "model": "MiniMax-M2.5",
    "input":{
        "messages":[      
            {
                "role": "user",
                "content": "Who are you?"
            }
        ]
    },
    "parameters": {
        "result_format": "message"
    }
}'

Response

{
    "output": {
        "choices": [
            {
                "finish_reason": "stop",
                "message": {
                    "content": "Hello! I am MiniMax-M2.5, an AI assistant developed by MiniMax. I can help you answer questions, provide information, hold conversations, and complete various text-related tasks. How can I help you?",
                    "reasoning_content": "The user asked \"Who are you?\".\n\nI should reply and introduce myself. I should state that I am MiniMax-M2.5, an AI assistant developed by MiniMax.",
                    "role": "assistant"
                }
            }
        ]
    },
    "usage": {
        "input_tokens": 41,
        "output_tokens": 79,
        "output_tokens_details": {
            "reasoning_tokens": 39
        },
        "prompt_tokens_details": {
            "cached_tokens": 0
        },
        "total_tokens": 120
    },
    "request_id": "1bbd770e-564a-4601-83fc-3bf639423xxx"
}
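
Note that the DashScope envelope differs from the OpenAI-compatible one: the choices live under output, and usage reports input_tokens and output_tokens rather than prompt_tokens and completion_tokens. A minimal sketch of parsing it (using a trimmed copy of the response shown above):

```python
import json

# Trimmed copy of the DashScope response shown above
raw = """{
    "output": {"choices": [{"finish_reason": "stop",
                            "message": {"content": "Hello! I am MiniMax-M2.5.",
                                        "reasoning_content": "The user asked who I am.",
                                        "role": "assistant"}}]},
    "usage": {"input_tokens": 41, "output_tokens": 79,
              "output_tokens_details": {"reasoning_tokens": 39},
              "total_tokens": 120}
}"""

data = json.loads(raw)
message = data["output"]["choices"][0]["message"]
answer = message["content"]
reasoning = message.get("reasoning_content", "")
# DashScope field names: input_tokens / output_tokens, not prompt / completion
total_tokens = data["usage"]["total_tokens"]
reasoning_tokens = data["usage"]["output_tokens_details"]["reasoning_tokens"]
```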

Model features

| Model | Multi-turn conversation | Deep thinking | Function calling | Structured output | Web search | Partial mode | Context cache |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MiniMax-M2.5 | Supported | Supported | Supported | Not supported | Supported | Not supported | Supported (implicit cache only) |
| MiniMax-M2.1 | Supported | Supported | Supported | Not supported | Supported | Not supported | Not supported |
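
Because function calling is supported, a request body can carry an OpenAI-style tools array. A minimal sketch of such a payload (the get_weather tool and its parameters are hypothetical examples for illustration, not something Model Studio provides):

```python
import json

payload = {
    "model": "MiniMax-M2.5",
    "messages": [{"role": "user", "content": "What is the weather in Hangzhou?"}],
    "tools": [{
        "type": "function",
        "function": {
            # Hypothetical tool; the schema shape follows the OpenAI tools format
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

# Serializes to the JSON body of a chat-completions request
body = json.dumps(payload)
```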

Default parameter values

| Model | temperature | top_p | presence_penalty |
| --- | --- | --- | --- |
| MiniMax-M2.5 | 1.0 | 0.95 | 0.0 |
| MiniMax-M2.1 | 1.0 | 0.95 | 0.0 |
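
These defaults can be overridden per request. With the OpenAI-compatible endpoint, the sampling parameters go directly in the request body; the values below are illustrative, not recommendations:

```python
payload = {
    "model": "MiniMax-M2.5",
    "messages": [{"role": "user", "content": "Summarize this report."}],
    # Override the defaults (temperature 1.0, top_p 0.95, presence_penalty 0.0)
    "temperature": 0.7,
    "top_p": 0.9,
    "presence_penalty": 0.5,
}
```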

Error codes

If a model call fails and returns an error message, see Error messages for troubleshooting.