OpenClaw is an open source AI assistant platform that supports multiple messaging channels. This topic describes how to set up and use Alibaba Cloud Model Studio's Coding Plan with OpenClaw.
Install OpenClaw
Install or update Node.js (v22.0 or later).
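You can confirm that your installed version meets this requirement before continuing. A minimal POSIX-shell sketch of the check (the version string is hardcoded here for illustration; in practice it comes from `node --version`):

```shell
# Parse a Node.js version string and check that the major version is >= 22.
# In practice: ver=$(node --version)
ver="v22.3.0"
major=${ver#v}        # strip the leading "v"
major=${major%%.*}    # keep only the major component
if [ "$major" -ge 22 ]; then
  echo "Node.js version OK"
else
  echo "Node.js v$major is too old; install v22.0 or later" >&2
fi
```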
Install OpenClaw.
macOS/Linux
In Terminal:
```shell
curl -fsSL https://openclaw.ai/install.sh | bash
```

Windows
In PowerShell:
```powershell
iwr -useb https://openclaw.ai/install.ps1 | iex
```

Follow the on-screen prompts to configure OpenClaw:
| Configuration | Description |
| --- | --- |
| I understand this is powerful and inherently risky. Continue? | Select Yes. |
| Onboarding mode | Select QuickStart. |
| Model/auth provider | Select Skip for now. You can configure this later. |
| Filter models by provider | Select All providers. |
| Default model | Use the default configuration. |
| Select channel (QuickStart) | Select Skip for now. You can configure this later. |
| Configure skills now? (recommended) | Select No. You can configure this later. |
| Enable hooks? | Press the space bar to select it, then select Skip for now and press Enter to continue. |
| How do you want to hatch your bot? | Select Hatch in TUI. |
Set up Coding Plan
Web UI
Open the Web UI.
```shell
openclaw dashboard
```

In the Web UI, navigate to the configuration page.
Paste the following content into the Raw JSON field. Replace the existing content.
To keep your existing configuration, do not replace all content at once. For safe editing, see How to safely modify an existing configuration?
Replace `YOUR_API_KEY` with your Coding Plan exclusive API key.
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "bailian": {
        "baseUrl": "https://coding-intl.dashscope.aliyuncs.com/v1",
        "apiKey": "YOUR_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "qwen3.5-plus", "name": "qwen3.5-plus", "reasoning": false, "input": ["text", "image"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 1000000, "maxTokens": 65536 },
          { "id": "qwen3-max-2026-01-23", "name": "qwen3-max-2026-01-23", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 65536 },
          { "id": "qwen3-coder-next", "name": "qwen3-coder-next", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 65536 },
          { "id": "qwen3-coder-plus", "name": "qwen3-coder-plus", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 1000000, "maxTokens": 65536 },
          { "id": "MiniMax-M2.5", "name": "MiniMax-M2.5", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 204800, "maxTokens": 131072 },
          { "id": "glm-5", "name": "glm-5", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 202752, "maxTokens": 16384 },
          { "id": "glm-4.7", "name": "glm-4.7", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 202752, "maxTokens": 16384 },
          { "id": "kimi-k2.5", "name": "kimi-k2.5", "reasoning": false, "input": ["text", "image"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 32768 }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "bailian/qwen3.5-plus" },
      "models": {
        "bailian/qwen3.5-plus": {},
        "bailian/qwen3-max-2026-01-23": {},
        "bailian/qwen3-coder-next": {},
        "bailian/qwen3-coder-plus": {},
        "bailian/MiniMax-M2.5": {},
        "bailian/glm-5": {},
        "bailian/glm-4.7": {},
        "bailian/kimi-k2.5": {}
      }
    }
  },
  "gateway": { "mode": "local" }
}
```

Click Save, and then click Update to apply.
After saving, the API key displays as “__OPENCLAW_REDACTED__” in the UI. This does not affect actual API calls.
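After applying the configuration, you can sanity-check the endpoint independently of OpenClaw. Because the provider is declared with `"api": "openai-completions"`, the service is assumed to accept OpenAI-style chat completion requests. The sketch below (the `CODING_PLAN_API_KEY` environment variable name is illustrative) only builds the request and prints the target URL; the actual call stays commented out until a real key is set:

```python
import json
import os
import urllib.request

# Base URL and model name taken from the Coding Plan configuration above.
BASE_URL = "https://coding-intl.dashscope.aliyuncs.com/v1"
payload = {
    "model": "qwen3.5-plus",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        # Replace YOUR_API_KEY, or export CODING_PLAN_API_KEY, before sending.
        "Authorization": f"Bearer {os.environ.get('CODING_PLAN_API_KEY', 'YOUR_API_KEY')}",
        "Content-Type": "application/json",
    },
)
print(req.full_url)
# Uncomment to send the request once a real key is configured:
# print(urllib.request.urlopen(req).read().decode("utf-8"))
```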

Terminal
Open the config file.
```shell
nano ~/.openclaw/openclaw.json
```

Paste the following content into the config file. Replace `YOUR_API_KEY` with your Coding Plan exclusive API key.

To keep your existing configuration, do not replace all content at once. For safe editing, see How to safely modify an existing configuration?
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "bailian": {
        "baseUrl": "https://coding-intl.dashscope.aliyuncs.com/v1",
        "apiKey": "YOUR_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "qwen3.5-plus", "name": "qwen3.5-plus", "reasoning": false, "input": ["text", "image"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 1000000, "maxTokens": 65536 },
          { "id": "qwen3-max-2026-01-23", "name": "qwen3-max-2026-01-23", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 65536 },
          { "id": "qwen3-coder-next", "name": "qwen3-coder-next", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 65536 },
          { "id": "qwen3-coder-plus", "name": "qwen3-coder-plus", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 1000000, "maxTokens": 65536 },
          { "id": "MiniMax-M2.5", "name": "MiniMax-M2.5", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 204800, "maxTokens": 131072 },
          { "id": "glm-5", "name": "glm-5", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 202752, "maxTokens": 16384 },
          { "id": "glm-4.7", "name": "glm-4.7", "reasoning": false, "input": ["text"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 202752, "maxTokens": 16384 },
          { "id": "kimi-k2.5", "name": "kimi-k2.5", "reasoning": false, "input": ["text", "image"], "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }, "contextWindow": 262144, "maxTokens": 32768 }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "bailian/qwen3.5-plus" },
      "models": {
        "bailian/qwen3.5-plus": {},
        "bailian/qwen3-max-2026-01-23": {},
        "bailian/qwen3-coder-next": {},
        "bailian/qwen3-coder-plus": {},
        "bailian/MiniMax-M2.5": {},
        "bailian/glm-5": {},
        "bailian/glm-4.7": {},
        "bailian/kimi-k2.5": {}
      }
    }
  },
  "gateway": { "mode": "local" }
}
```

Save the file and exit. Then restart the gateway to apply:
```shell
openclaw gateway restart
```
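If the gateway fails to restart after editing, a common cause is malformed JSON. A quick way to check the file, assuming `python3` is available (demonstrated here on a temporary file; point it at `~/.openclaw/openclaw.json` in practice):

```shell
# Validate that a config file parses as JSON before blaming the gateway.
# Demonstrated on a temp file; use ~/.openclaw/openclaw.json in practice.
tmp=$(mktemp)
echo '{"gateway": {"mode": "local"}}' > "$tmp"
if python3 -m json.tool "$tmp" > /dev/null 2>&1; then
  echo "config OK"
else
  echo "config is not valid JSON" >&2
fi
rm -f "$tmp"
```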
Use OpenClaw
Use OpenClaw through the Web UI or TUI.
Web UI
Open the Web UI.
```shell
openclaw dashboard
```

Start a conversation.

TUI
Start the TUI.
```shell
openclaw tui
```

Start a conversation.

Switch models
During a session (temporary)
In the TUI, use `/model <model name>` to switch models for the current session.

```
/model qwen3-coder-next
```

The message “model set to qwen3-coder-next” confirms the change.
Change the default model (permanent)
To use a specific model by default, update the `agents.defaults.model.primary` field in the config file. See Modify config file.

```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "bailian/qwen3.5-plus" }
    }
  }
}
```
FAQ
How do I view available models?
In the TUI, run `/model` to view the model list. Press Enter to select a model. Press Esc to exit.

Why do I get “HTTP 401: Incorrect API key provided”?
Possible causes:

- The API key is invalid, expired, or incorrectly formatted. Verify that you are using a Coding Plan exclusive API key with no extra spaces, and that your subscription is active.
- A cached configuration is causing the error. Delete the `providers.bailian` section from `~/.openclaw/agents/main/agent/models.json`, and then restart OpenClaw.
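The cache fix can be scripted. A minimal sketch (the `drop_bailian` helper is illustrative, not part of OpenClaw; it operates on the file's text so you can preview the result before writing it back to `~/.openclaw/agents/main/agent/models.json`):

```python
import json

def drop_bailian(config_text: str) -> str:
    """Remove the providers.bailian section, leaving everything else intact."""
    data = json.loads(config_text)
    data.get("providers", {}).pop("bailian", None)  # no-op if already absent
    return json.dumps(data, indent=2)

# Example with a minimal cache file:
cached = '{"providers": {"bailian": {"apiKey": "stale"}, "other": {}}}'
print(drop_bailian(cached))
```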
How to safely modify an existing configuration?
Do not replace all content at once. A full replacement overwrites your existing settings such as channel configurations. Instead, make partial edits.
Using the Coding Plan configuration as a reference, locate the `models`, `agents`, and `gateway` fields in the existing config file, and merge the Coding Plan values into the corresponding fields. If any of these fields do not exist, add them.
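The merge described above can be sketched in code. A minimal recursive merge (the `deep_merge` helper is illustrative, not part of OpenClaw; nested objects are merged key by key, and Coding Plan values win on conflicts, so existing sections such as channel settings survive):

```python
import json

def deep_merge(base: dict, incoming: dict) -> dict:
    """Recursively merge `incoming` into `base`; incoming values win on conflict."""
    for key, value in incoming.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            deep_merge(base[key], value)
        else:
            base[key] = value
    return base

# Existing config with a channel section that must survive the merge:
existing = {"channels": {"slack": {"token": "..."}}, "gateway": {"mode": "remote"}}
# Subset of the Coding Plan configuration:
coding_plan = {"models": {"mode": "merge"}, "gateway": {"mode": "local"}}

merged = deep_merge(existing, coding_plan)
print(json.dumps(merged, indent=2))
```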
For more questions, see the FAQ.