Generative AI is shifting from "cool demos" to real-world production. But for many developers, the barrier to entry is still high. You have to learn new SDKs, worry about GPU provisioning, or deal with expensive API costs.
In this series, "GenAI on Alibaba Cloud," we are going to break down how to build powerful AI applications using the Qwen (Tongyi Qianwen, Chinese: 通义千问) model family. Qwen is consistently ranked as one of the top-performing LLMs globally, often rivaling GPT-4 in coding and reasoning benchmarks, but at a fraction of the cost.
In Episode 1, we aren't just running a "Hello World." We are going to learn the "Drop-in Replacement" technique. You will learn how to access Qwen using the standard tools you likely already know, making migration seamless.
Let's dive in.
Model Studio (formerly DashScope) is Alibaba Cloud’s "Model-as-a-Service" platform. Instead of buying a GPU and loading a model yourself, you simply use an API to chat with massive models running on Alibaba Cloud's infrastructure.
Why use it? You get access to top-tier models through a simple API call: no GPU provisioning, no proprietary SDK to learn, and a fraction of the cost of comparable APIs.
Before we write code, we need a key to the engine room.
1. Log in to the Alibaba Cloud Console.
2. Search for "Model Studio" in the top search bar.
3. Critical Step: If prompted to choose a region, select Singapore (or another International region close to you).
4. Navigate to the Dashboard.
5. Click Key Management.
6. Create an API key and copy it (it starts with sk-). Save it somewhere safe.
We will use Python. Instead of installing a proprietary Alibaba Cloud SDK, we will use the standard openai library.
pip install openai python-dotenv
Note: We install python-dotenv so we can keep the API key in a local .env file instead of hard-coding it in the script.
Create a file named .env in your project folder and paste your key there:
DASHSCOPE_API_KEY=sk-your_actual_key_here_no_quotes
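Optionally, you can verify that the key loads correctly before writing the main script. This is a minimal sketch; the file name check_key.py and the printed messages are just illustrative:

```python
# check_key.py -- optional sanity check that the .env file is picked up
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

key = os.getenv("DASHSCOPE_API_KEY")
if key and key.startswith("sk-"):
    print(f"API key loaded (ends with ...{key[-4:]})")
else:
    print("DASHSCOPE_API_KEY not found - check your .env file")
```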
Now, create hello_qwen.py. This script demonstrates how to route standard OpenAI commands to Alibaba Cloud's Qwen-Plus model.
import os
from openai import OpenAI
from dotenv import load_dotenv

# 1. Load environment variables from the .env file
load_dotenv()


def chat_with_qwen():
    # 2. Initialize the client.
    # This is the "magic trick": we use the standard OpenAI client, but point
    # 'base_url' at Alibaba Cloud's international endpoint.
    client = OpenAI(
        # Get the key from the .env file
        api_key=os.getenv("DASHSCOPE_API_KEY"),
        # CRITICAL: This URL connects to the Singapore/International gateway.
        # If your account is on the Mainland China site, use
        # 'https://dashscope.aliyuncs.com/compatible-mode/v1' instead.
        base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
    )

    print("Connecting to Qwen...")

    try:
        # 3. Send the prompt.
        # 'qwen-plus' offers a good balance of speed and intelligence.
        completion = client.chat.completions.create(
            model="qwen-plus",
            messages=[
                {'role': 'system', 'content': 'You are a helpful expert on Cloud Computing.'},
                {'role': 'user', 'content': 'Explain the concept of "Serverless" to a 10-year-old.'}
            ]
        )

        # 4. Print the response
        print("\n--- Qwen says: ---")
        print(completion.choices[0].message.content)
        print("------------------")

    except Exception as e:
        print(f"Error: {e}")
        print("Tip: Check if your API Key matches the region in your base_url.")


if __name__ == '__main__':
    chat_with_qwen()
Run the script in your terminal:
python hello_qwen.py
You should see Qwen's answer, a kid-friendly explanation of serverless computing, printed between the "--- Qwen says: ---" markers.
In the code above, we used model="qwen-plus". However, Alibaba Cloud offers a few flavors. Here is how to choose:
| Model Name | Best For... | Cost Profile |
|---|---|---|
| qwen-turbo | Simple chatbots, translation, summarizing short text. | Lowest (Very fast) |
| qwen-plus | General purpose assistant, content creation, reasoning. | Balanced (Standard choice) |
| qwen-max | Complex coding, math, legal analysis, and other tasks that demand nuance. | Higher (Best performance) |
| qwen-vl-plus | Analyzing Images (Visual Language). | Varies |
Pro Tip: Start developing with qwen-turbo to save money/credits, then switch to plus or max for production if you need higher quality answers.
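One way to make that switch painless is to read the model name from configuration, so swapping qwen-turbo for qwen-plus or qwen-max is a config change rather than a code change. The QWEN_MODEL environment variable below is just an illustrative convention, not something Model Studio requires:

```python
import os
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()

# Illustrative convention: choose the model via an env var (default: qwen-turbo)
MODEL = os.getenv("QWEN_MODEL", "qwen-turbo")

client = OpenAI(
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
)

completion = client.chat.completions.create(
    model=MODEL,
    messages=[{'role': 'user', 'content': 'In one sentence, what is serverless computing?'}],
)
print(f"[{MODEL}] {completion.choices[0].message.content}")
```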
If you ran the code and got an error, check these common pitfalls:
Authentication error (invalid API key)
● Cause: Your API key region and your base_url do not match.
● Fix: If you created your key on the International site (alibabacloud.com), you must use https://dashscope-intl.aliyuncs.com/compatible-mode/v1. With the wrong URL, the gateway cannot find your key and rejects the request.
Bad request (invalid parameter)
● Cause: Usually an issue with the message format.
● Fix: Ensure every entry in your messages list has the structure {'role': 'user', 'content': '...'}.
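If you want your script to distinguish these cases automatically, the openai v1.x client raises typed exceptions you can catch. Here is a sketch that reuses the same client setup as hello_qwen.py:

```python
import os
import openai
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()

client = OpenAI(
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
)

try:
    completion = client.chat.completions.create(
        model="qwen-plus",
        messages=[{'role': 'user', 'content': 'ping'}],
    )
    print(completion.choices[0].message.content)
except openai.AuthenticationError:
    # Wrong key, or a key that does not match this base_url's region
    print("Authentication failed: check DASHSCOPE_API_KEY and the base_url region.")
except openai.BadRequestError as e:
    # Malformed request, e.g. a broken messages structure
    print(f"Bad request: {e}")
except openai.APIConnectionError:
    print("Could not reach the endpoint: check the base_url and your network.")
```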
Congratulations! You have successfully integrated one of the world's most powerful LLMs into your Python application.
Since we set this up using the OpenAI-compatible client, you have unlocked a massive ecosystem. Any tool, library, or framework (like LangChain or AutoGen) that works with OpenAI will now work with Alibaba Cloud Qwen.
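For example, here is a sketch of pointing LangChain's ChatOpenAI wrapper at the same endpoint (this assumes you have run pip install langchain-openai; the prompt is just an example):

```python
import os
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

load_dotenv()

# Same drop-in trick: an OpenAI-compatible wrapper pointed at the Qwen endpoint
llm = ChatOpenAI(
    model="qwen-plus",
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
)

response = llm.invoke("Summarize what Alibaba Cloud Model Studio is in one sentence.")
print(response.content)
```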
In Episode 2, we will put this to work. We will build a RAG (Retrieval Augmented Generation) system to make Qwen answer questions based on your own private PDF documents.
See you in the next episode!
Disclaimer: The views expressed herein are for reference only and don't necessarily represent the official views of Alibaba Cloud.