
Alibaba Cloud Model Studio: OpenAI Vision interface compatibility

Last Updated: Jan 06, 2026

Qwen-VL models from Alibaba Cloud Model Studio are compatible with the OpenAI API specification. To migrate an existing OpenAI application to Model Studio, you only need to modify the following three parameters:

  • base_url: Set this parameter to https://dashscope-intl.aliyuncs.com/compatible-mode/v1.

    If you use a model in the Virginia region, set the base_url to https://dashscope-us.aliyuncs.com/compatible-mode/v1. If you use a model in the Beijing region, set the base_url to https://dashscope.aliyuncs.com/compatible-mode/v1.

  • api_key: Replace the value with your Model Studio API key.

  • model: Specify a model from the following list.
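If your application can run against more than one region, the three region-specific base_url values above can be kept in a small lookup instead of being hardcoded. A minimal sketch (the BASE_URLS table and base_url_for helper are our names, not part of any SDK):

```python
# Region-specific OpenAI-compatible endpoints for Model Studio.
BASE_URLS = {
    "singapore": "https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
    "virginia": "https://dashscope-us.aliyuncs.com/compatible-mode/v1",
    "beijing": "https://dashscope.aliyuncs.com/compatible-mode/v1",
}


def base_url_for(region: str) -> str:
    """Return the compatible-mode base_url for a Model Studio region name."""
    try:
        return BASE_URLS[region.lower()]
    except KeyError:
        raise ValueError(f"unknown Model Studio region: {region!r}")
```

You would then pass base_url_for("singapore") (or your configured region) as the base_url argument when constructing the client.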

Supported models

Global

  • Qwen-VL series models: qwen3-vl-plus, qwen3-vl-plus-2025-09-23, qwen3-vl-flash, qwen3-vl-flash-2025-10-15, qwen3-vl-235b-a22b-thinking, qwen3-vl-235b-a22b-instruct, qwen3-vl-32b-instruct, qwen3-vl-30b-a3b-thinking, qwen3-vl-30b-a3b-instruct, qwen3-vl-8b-thinking, qwen3-vl-8b-instruct

  • Qwen-OCR series models: qwen-vl-ocr, qwen-vl-ocr-2025-11-20

International

  • Qwen-VL series models

    • qwen3-vl-plus, qwen3-vl-plus-2025-12-19, qwen3-vl-plus-2025-09-23, qwen3-vl-flash, qwen3-vl-flash-2025-10-15, qwen3-vl-235b-a22b-thinking, qwen3-vl-235b-a22b-instruct, qwen3-vl-32b-instruct, qwen3-vl-30b-a3b-thinking, qwen3-vl-30b-a3b-instruct, qwen3-vl-8b-thinking, qwen3-vl-8b-instruct

    • qwen-vl-max, qwen-vl-max-latest, qwen-vl-max-2025-08-13, qwen-vl-max-2025-04-08, qwen-vl-plus, qwen-vl-plus-latest, qwen-vl-plus-2025-08-15, qwen-vl-plus-2025-07-10, qwen-vl-plus-2025-05-07, qwen-vl-plus-2025-01-25, qwen2.5-vl-72b-instruct, qwen2.5-vl-32b-instruct, qwen2.5-vl-7b-instruct, qwen2.5-vl-3b-instruct

  • QVQ series models: qvq-max, qvq-max-latest, qvq-max-2025-03-25

  • Qwen-OCR series models: qwen-vl-ocr, qwen-vl-ocr-2025-11-20

US

  • qwen3-vl-flash-us, qwen3-vl-flash-2025-10-15-us

Mainland China

  • Qwen-VL series models

    • qwen3-vl-plus, qwen3-vl-plus-2025-12-19, qwen3-vl-plus-2025-09-23, qwen3-vl-flash, qwen3-vl-flash-2025-10-15, qwen3-vl-235b-a22b-thinking, qwen3-vl-235b-a22b-instruct, qwen3-vl-32b-instruct, qwen3-vl-30b-a3b-thinking, qwen3-vl-30b-a3b-instruct, qwen3-vl-8b-thinking, qwen3-vl-8b-instruct

    • qwen-vl-max, qwen-vl-max-latest, qwen-vl-max-2025-08-13, qwen-vl-max-2025-04-08, qwen-vl-max-2025-04-02, qwen-vl-max-2025-01-25, qwen-vl-max-2024-12-30, qwen-vl-max-2024-11-19, qwen-vl-max-2024-10-30, qwen-vl-max-2024-08-09, qwen-vl-plus, qwen-vl-plus-latest, qwen-vl-plus-2025-08-15, qwen-vl-plus-2025-07-10, qwen-vl-plus-2025-05-07, qwen-vl-plus-2025-01-25, qwen-vl-plus-2025-01-02, qwen-vl-plus-2024-08-09, qwen2.5-vl-72b-instruct, qwen2.5-vl-32b-instruct, qwen2.5-vl-7b-instruct, qwen2.5-vl-3b-instruct, qwen2-vl-72b-instruct, qwen2-vl-7b-instruct, qwen2-vl-2b-instruct

  • QVQ series models: qvq-max, qvq-max-latest, qvq-max-2025-03-25

  • Qwen-OCR series models: qwen-vl-ocr, qwen-vl-ocr-latest, qwen-vl-ocr-2025-11-20, qwen-vl-ocr-2025-08-28, qwen-vl-ocr-2025-04-13, qwen-vl-ocr-2024-10-28

Call models

Sample request

This section provides sample calls in Python (using the OpenAI SDK and the langchain_openai SDK) and cURL (using the HTTP interface). For more examples in other programming languages or with different input methods, see Visual understanding request examples.

QVQ models support only streaming output. For QVQ models, see Visual reasoning.

Use the OpenAI SDK

from openai import OpenAI
import os


def get_response():
    client = OpenAI(
        # API keys vary by region. To get an API key, see https://www.alibabacloud.com/help/en/model-studio/get-api-key.
        api_key=os.getenv("DASHSCOPE_API_KEY"),
        # The following is the base_url for the Singapore region. If you use a model in the Virginia region, change the base_url to https://dashscope-us.aliyuncs.com/compatible-mode/v1.
        # If you use a model in the Beijing region, change the base_url to https://dashscope.aliyuncs.com/compatible-mode/v1.
        base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
    )
    completion = client.chat.completions.create(
        model="qwen3-vl-plus",
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "What is this"},
                    {
                        "type": "image_url",
                        "image_url": {
                            "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"
                        },
                    },
                ],
            }
        ],
        stream=True,
        stream_options={"include_usage": True},
    )
    for chunk in completion:
        print(chunk.model_dump())


if __name__ == "__main__":
    get_response()

After you run the code, the following result is returned:

{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': '', 'function_call': None, 'refusal': None, 'role': 'assistant', 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': 'This', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': 'photo', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': 'shows', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}

......

{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': 'moment', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': '.', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': None, 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [{'delta': {'content': '', 'function_call': None, 'refusal': None, 'role': None, 'tool_calls': None}, 'finish_reason': 'stop', 'index': 0, 'logprobs': None}], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': None}
{'id': 'chatcmpl-31042a05-c968-4fc6-ba28-c3aa471258dc', 'choices': [], 'created': 1765780318, 'model': 'qwen-vl-plus', 'object': 'chat.completion.chunk', 'service_tier': None, 'system_fingerprint': None, 'usage': {'completion_tokens': 230, 'prompt_tokens': 1259, 'total_tokens': 1489, 'completion_tokens_details': {'accepted_prediction_tokens': None, 'audio_tokens': None, 'reasoning_tokens': None, 'rejected_prediction_tokens': None, 'text_tokens': 230}, 'prompt_tokens_details': {'audio_tokens': None, 'cached_tokens': 0}}}
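Client code usually reassembles these chunks into a single reply. A minimal sketch that operates on the dicts produced by chunk.model_dump() above (the function name collect_stream is ours): content deltas are concatenated, and token usage is read from the final chunk, whose choices list is empty when include_usage is enabled.

```python
def collect_stream(chunks):
    """Join streamed content deltas and capture usage from the final chunk."""
    parts = []
    usage = None
    for chunk in chunks:
        if chunk["choices"]:
            delta = chunk["choices"][0]["delta"]
            if delta.get("content"):
                parts.append(delta["content"])
        elif chunk.get("usage"):
            # With stream_options={"include_usage": True}, the last chunk
            # carries usage and an empty choices list.
            usage = chunk["usage"]
    return "".join(parts), usage
```

Passing the chunks from the loop above through this function yields the full response text plus the usage dict in one place.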

Use the langchain_openai SDK

Prerequisites

  • Make sure that a Python environment is installed on your computer.

  • Run the following command to install the langchain_openai SDK.

    # If the following command fails, replace pip with pip3.
    pip install -U langchain_openai
  • Activate Alibaba Cloud Model Studio and obtain an API key: Create an API key.

  • We recommend that you configure the API key as an environment variable to reduce the risk of API key leaks: Set API key as environment variable. You can also hardcode the API key in your code, but this increases the risk of leaks.

Usage

The following examples demonstrate how to use Qwen-VL with the langchain_openai SDK.

Non-streaming output

Use the invoke method for non-streaming output, as shown in the following sample code:

from langchain_openai import ChatOpenAI
import os

def get_response():
    llm = ChatOpenAI(
        # API keys are region-specific. To obtain an API key, see https://www.alibabacloud.com/help/en/model-studio/get-api-key.
        api_key=os.getenv("DASHSCOPE_API_KEY"),
        # The following base_url is for the Singapore region. If you use a model in the Virginia region, replace the base_url with https://dashscope-us.aliyuncs.com/compatible-mode/v1.
        # If you use a model in the Beijing region, replace the base_url with https://dashscope.aliyuncs.com/compatible-mode/v1.
        base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
        model="qwen3-vl-plus",
    )
    messages = [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is this?"},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"
                    },
                },
            ],
        }
    ]
    response = llm.invoke(messages)
    print(response.content)

if __name__ == "__main__":
    get_response()

After you run the code, the following result is returned:

{
  "content": "In the picture, a woman and her dog are interacting on the beach. The dog is sitting on the ground, extending its paw as if to shake hands or give a high five. The woman is wearing a plaid shirt and seems to be having an intimate interaction with the dog, and is smiling. The background is the ocean and the sky at sunrise or sunset. This is a heartwarming photo that shows a moment of friendship between a person and a pet.",
  "additional_kwargs": {
    "refusal": null
  },
  "response_metadata": {
    "token_usage": {
      "completion_tokens": 267,
      "prompt_tokens": 1259,
      "total_tokens": 1526,
      "completion_tokens_details": {
        "accepted_prediction_tokens": null,
        "audio_tokens": null,
        "reasoning_tokens": null,
        "rejected_prediction_tokens": null,
        "text_tokens": 267
      },
      "prompt_tokens_details": {
        "audio_tokens": null,
        "cached_tokens": 0
      }
    },
    "model_provider": "openai",
    "model_name": "qwen-vl-plus",
    "system_fingerprint": null,
    "id": "chatcmpl-9f3eba85-4f7a-4f73-b254-220a650xxxxx",
    "finish_reason": "stop",
    "logprobs": null
  },
  "type": "ai",
  "name": null,
  "id": "lc_run--019b1191-f411-7153-ac51-b8b0410xxxxx-0",
  "tool_calls": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 1259,
    "output_tokens": 267,
    "total_tokens": 1526,
    "input_token_details": {
      "cache_read": 0
    },
    "output_token_details": {}
  }
}
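The user message above follows the OpenAI multimodal content format: a list containing one text part plus one image_url part per image. If you build such messages in several places, a small helper keeps them consistent. A sketch under the assumption that all images are passed by URL (vision_message is our name, not an SDK function):

```python
def vision_message(text, image_urls):
    """Build an OpenAI-style multimodal user message from text plus image URLs."""
    content = [{"type": "text", "text": text}]
    content.extend(
        {"type": "image_url", "image_url": {"url": url}} for url in image_urls
    )
    return {"role": "user", "content": content}
```

The returned dict can be placed directly into the messages list passed to llm.invoke or llm.stream.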

Streaming output

The following example does not apply to QVQ models. For QVQ models, see Visual reasoning.

from langchain_openai import ChatOpenAI
import os


def get_response():
    llm = ChatOpenAI(
        # API keys vary by region. To get an API key, see https://www.alibabacloud.com/help/en/model-studio/get-api-key.
        api_key=os.getenv("DASHSCOPE_API_KEY"),
        # The following is the base_url for the Singapore region. If you use a model in the Virginia region, change the base_url to https://dashscope-us.aliyuncs.com/compatible-mode/v1.
        # If you use a model in the Beijing region, replace the base_url with https://dashscope.aliyuncs.com/compatible-mode/v1.
        base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
        model="qwen3-vl-plus",
        # With these settings, token usage information appears in the final chunk of the streaming output.
        stream_options={"include_usage": True}
    )
    messages = [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is this"},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"
                    },
                },
            ],
        }
    ]
    response = llm.stream(messages)
    for chunk in response:
        print(chunk.json())

if __name__ == "__main__":
    get_response()

After you run the code, the following result is returned:

{"content": "", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": "This", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " picture", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " shows", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " a", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " dog and a little", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " girl. The dog looks", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " friendly and may be", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " a pet, while the little girl", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " seems to be interacting", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " or playing with the dog.", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " This is a picture that shows", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " the warm relationship", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": " between humans and animals.", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": "", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": "", "additional_kwargs": {}, "response_metadata": {"finish_reason": "stop"}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": null, "tool_call_chunks": []}
{"content": "", "additional_kwargs": {}, "response_metadata": {}, "type": "AIMessageChunk", "name": null, "id": "run-xxx", "example": false, "tool_calls": [], "invalid_tool_calls": [], "usage_metadata": {"input_tokens": 23, "output_tokens": 40, "total_tokens": 63}, "tool_call_chunks": []}

For information about the input parameters, see Input parameters. You set these parameters on the ChatOpenAI object.

Use the HTTP interface

You can access Qwen-VL models via the HTTP API. The response has the same structure as a response from the OpenAI service.

Prerequisites

  • Activate Alibaba Cloud Model Studio and obtain an API key: Create an API key.

  • We recommend that you configure the API key as an environment variable to reduce the risk of API key leaks: Set API key as environment variable. You can also hardcode the API key in your code, but this increases the risk of leaks.

Endpoints

Singapore: POST https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions
Virginia: POST https://dashscope-us.aliyuncs.com/compatible-mode/v1/chat/completions
Beijing: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions

Sample request

The following example shows a script that uses the curl command to call the API.

Note

If you have not configured the API key as an environment variable, you must replace $DASHSCOPE_API_KEY with your API key.

Non-streaming output

# ======= Important =======
# API keys vary by region. To get an API key, see https://www.alibabacloud.com/help/en/model-studio/get-api-key.
# The following is the base_url for the Singapore region. If you use a model in the Virginia region, change the base_url to https://dashscope-us.aliyuncs.com/compatible-mode/v1.
# If you use a model in the Beijing region, replace the base_url with https://dashscope.aliyuncs.com/compatible-mode/v1.
# === Delete this comment before execution ===

curl --location 'https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions' \
--header "Authorization: Bearer $DASHSCOPE_API_KEY" \
--header 'Content-Type: application/json' \
--data '{
  "model": "qwen3-vl-plus",
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What are these"
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"
          }
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/tiger.png"
          }
        }
      ]
    }
  ]
}'

After you run the command, the following result is returned:

{
  "choices": [
    {
      "message": {
        "content": "In Figure 1, a woman is interacting with her pet dog on the beach. The dog raises its front paw as if it wants to shake hands.\nFigure 2 is a CG-rendered picture of a tiger.",
        "role": "assistant"
      },
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null
    }
  ],
  "object": "chat.completion",
  "usage": {
    "prompt_tokens": 2509,
    "completion_tokens": 34,
    "total_tokens": 2543
  },
  "created": 1724729556,
  "system_fingerprint": null,
  "model": "qwen-vl-plus",
  "id": "chatcmpl-1abb4eb9-f508-9637-a8ba-ac7fc6f73e53"
}

Streaming output

To use streaming output, set the stream parameter to true in the request body.

# ======= Important =======
# API keys vary by region. To get an API key, see https://www.alibabacloud.com/help/en/model-studio/get-api-key.
# The following is the base_url for the Singapore region. If you use a model in the Virginia region, change the base_url to https://dashscope-us.aliyuncs.com/compatible-mode/v1.
# If you use a model in the Beijing region, replace the base_url with https://dashscope.aliyuncs.com/compatible-mode/v1.
# === Delete this comment before execution ===

curl --location 'https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions' \
--header "Authorization: Bearer $DASHSCOPE_API_KEY" \
--header 'Content-Type: application/json' \
--data '{
    "model": "qwen3-vl-plus",
    "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What is this"
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "https://dashscope.oss-cn-beijing.aliyuncs.com/images/dog_and_girl.jpeg"
          }
        }
      ]
    }
  ],
    "stream":true,
    "stream_options":{"include_usage":true}
}'

After you run the command, the following result is returned:

data: {"choices":[{"delta":{"content":"","role":"assistant"},"index":0,"logprobs":null,"finish_reason":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"finish_reason":null,"delta":{"content":"In the"},"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" picture,"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" a woman"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" and her dog are"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" interacting on the beach. The dog is sitting on the ground,"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" extending its paw as if to shake hands or give"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" a high five. The woman is wearing a plaid"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" shirt and seems to be having an intimate"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" interaction with the dog, and is smiling."},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" The background is the ocean and the sky at sunrise or"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" sunset. This is"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"delta":{"content":" a heartwarming photo that shows"},"finish_reason":null,"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[{"finish_reason":"stop","delta":{"content":" a moment of friendship between a person and a pet."},"index":0,"logprobs":null}],"object":"chat.completion.chunk","usage":null,"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: {"choices":[],"object":"chat.completion.chunk","usage":{"prompt_tokens":1276,"completion_tokens":79,"total_tokens":1355},"created":1724729595,"system_fingerprint":null,"model":"qwen-vl-plus","id":"chatcmpl-4c83f437-303f-907b-9de5-79cac83d6b18"}

data: [DONE]
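Each line of this output is a server-sent event. If you call the HTTP interface without an SDK, you must parse the data: lines yourself. A minimal sketch (parse_sse_lines is our name) that decodes each JSON payload and stops at the [DONE] sentinel:

```python
import json


def parse_sse_lines(lines):
    """Yield parsed JSON chunks from 'data: ...' SSE lines, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)
```

Feeding the response body line by line through this generator produces the same chunk dicts that the OpenAI SDK yields, which you can then concatenate into the full reply.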

For more information about the input parameters, see Input parameters.

Error response example

If a request fails, the response includes the `code` and `message` fields to indicate the cause of the error.

{
    "error": {
        "message": "Incorrect API key provided. ",
        "type": "invalid_request_error",
        "param": null,
        "code": "invalid_api_key"
    }
}
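When handling failures programmatically, you can pull the code and message fields out of the error body. A minimal sketch (describe_error is our name, not part of the API):

```python
import json


def describe_error(body: str) -> str:
    """Summarize an error response body as 'code: message'."""
    err = json.loads(body).get("error", {})
    code = err.get("code", "unknown_error")
    message = (err.get("message") or "").strip()
    return f"{code}: {message}"
```

Applied to the sample body above, this yields "invalid_api_key: Incorrect API key provided.", which you can log or match against known error codes.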

Error codes

See Status codes.