DMS provides two programmatic interfaces for managing Apache Airflow workflows: the REST API and the command-line interface (CLI). Both let you automate DAG (Directed Acyclic Graph) management without accessing the Airflow console directly.
REST API: Trigger DAG runs, monitor task status, and manage Airflow resources through HTTP requests — suited for integrating Airflow into external applications or CI/CD pipelines.
CLI: Run Airflow commands over HTTP — useful for scripting DAG operations and administrative tasks.
For the full Apache Airflow REST API reference, see Airflow REST API documentation.
Prerequisites
Before you begin, make sure you have:
- An Airflow instance in DMS
- The Airflow instance ID (af-xxxx), DMS tenant ID, and workspace ID
- The alibabacloud_dms20250414 SDK installed (required for the Python example only)
- cURL, or Python with the requests library installed
How it works
Both the REST API and CLI use the same two-step authentication flow:
1. Call the CreateAirflowLoginToken API operation to get a login token. The token is valid for 2 hours — if it expires before you log in, request a new one.
2. Log in to the Airflow instance using the token. DMS returns a session cookie, which you then pass in subsequent REST API or CLI requests.
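The login URL used in step 2 has a fixed shape. The helper below is a small sketch that assembles it from its parts; the endpoint, IDs, and token are the placeholder values used elsewhere in this topic:

```python
# Sketch: build the DMS Airflow login URL from its components.
# All values below are placeholders, not real credentials.

def build_login_url(endpoint: str, tenant_id: str, workspace_id: str,
                    airflow_id: str, token: str) -> str:
    """Return the login URL for the given Airflow instance and token."""
    return (f"https://{endpoint}/airflow/{tenant_id}/{workspace_id}/"
            f"{airflow_id}/login?token={token}")

url = build_login_url("data-cn-beijing-dms.aliyuncs.com",
                      "33***", "8691522017****", "af-b3a7797****", "<token>")
print(url)
```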
Use the REST API
Step 1: Get a login token
Call the CreateAirflowLoginToken API operation to get your login token.
The token is valid for 2 hours. If you don't log in within that window, request a new token before proceeding.
Step 2: Log in and get a session cookie
Send a login request with cURL:
```shell
curl -i "https://data-cn-beijing-dms.aliyuncs.com/airflow/<tenant-id>/<workspace-id>/<airflow-instance-id>/login?token=<token>"
```

Replace the placeholders with your actual values:
| Placeholder | Description | Example |
|---|---|---|
| data-cn-beijing-dms.aliyuncs.com | Regional endpoint for your Airflow instance | data-ap-southeast-1-dms.aliyuncs.com |
| <tenant-id> | DMS tenant ID | 33*** |
| <workspace-id> | DMS workspace ID | 8691522017**** |
| <airflow-instance-id> | Airflow instance ID | af-b3a7797**** |
| <token> | Login token from step 1 | |
A successful response returns HTTP 302 and sets a session cookie:
```
HTTP/2 302
set-cookie: session=xxxxxxx; Expires=Sat, 05 Jul 2025 09:07:18 GMT; Secure; HttpOnly; Path=/; SameSite=None; Partitioned
```

Use the session cookie value in all subsequent API requests.
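If you script the login step, you can extract the session value from the Set-Cookie header instead of copying it by hand. A minimal sketch that parses the leading name=value pair (using the sample header value from the response above):

```python
# Sample Set-Cookie header value from the login response
set_cookie = ("session=xxxxxxx; Expires=Sat, 05 Jul 2025 09:07:18 GMT; "
              "Secure; HttpOnly; Path=/; SameSite=None; Partitioned")

def extract_session(header: str) -> str:
    """Return the value of the leading session=... pair in a Set-Cookie header."""
    name, _, value = header.split(";", 1)[0].partition("=")
    if name.strip() != "session":
        raise ValueError("no session cookie found in header")
    return value.strip()

session_value = extract_session(set_cookie)
print(session_value)
```

If you use the requests library instead of cURL, `response.cookies['session']` gives you the same value directly, as the Python example later in this topic shows.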
Step 3: Call the REST API
Pass the session cookie in requests to any Airflow REST API endpoint. The following example checks instance health:
```shell
curl 'https://data-cn-beijing-dms.aliyuncs.com/airflow/<tenant-id>/<workspace-id>/<airflow-instance-id>/api/v1/health' \
  -b 'session=<session-cookie>'
```

Expected response:
```json
{
  "dag_processor": {
    "latest_dag_processor_heartbeat": null,
    "status": null
  },
  "metadatabase": {
    "status": "healthy"
  },
  "scheduler": {
    "latest_scheduler_heartbeat": "2025-06-05T09:13:03.075907+00:00",
    "status": "healthy"
  },
  "triggerer": {
    "latest_triggerer_heartbeat": null,
    "status": null
  }
}
```

Use the CLI
Step 1: Get a login token
Call the CreateAirflowLoginToken API operation to get your login token.
The token is valid for 2 hours. If it expires before you log in, request a new one.
Step 2: Log in and get a session cookie
The login step is identical to the REST API flow. Send the login request and save the returned session cookie value.
```shell
curl -i "https://data-cn-beijing-dms.aliyuncs.com/airflow/<tenant-id>/<workspace-id>/<airflow-instance-id>/login?token=<token>"
```

Step 3: Run CLI commands
Send a POST request to the /api/v1/command endpoint with the Airflow CLI command as the command query parameter. The following example checks the Airflow version:
```shell
curl -X POST \
  'https://data-cn-beijing-dms.aliyuncs.com/airflow/<tenant-id>/<workspace-id>/<airflow-instance-id>/api/v1/command?command=version' \
  -b 'session=<session-cookie>'
```

Expected response:

```json
{
  "stderr": "",
  "stdout": "2.10.4\n"
}
```

Call the REST API from Python
This example shows how to authenticate and list DAGs using the DMS Python SDK and the requests library.
Step 1: Install the DMS SDK
```shell
pip install alibabacloud_dms20250414
```

Step 2: Set up credentials
Store your AccessKey credentials as environment variables. Avoid hardcoding credentials in source code.
```shell
export ALIBABA_CLOUD_ACCESS_KEY_ID=<your-access-key-id>
export ALIBABA_CLOUD_ACCESS_KEY_SECRET=<your-access-key-secret>
```

For other credential configuration options, see Configure credentials.
Step 3: Create and run the script
Create a file named dms_rest_api.py with the following content:
```python
# -*- coding: utf-8 -*-
import sys

import requests
from alibabacloud_dms20250414.client import Client as Dms20250414Client
from alibabacloud_credentials.client import Client as CredentialClient
from alibabacloud_dms20250414.models import CreateAirflowLoginTokenResponseBodyData
from alibabacloud_tea_openapi import models as open_api_models
from alibabacloud_dms20250414 import models as dms_20250414_models
from alibabacloud_tea_util import models as util_models
from alibabacloud_tea_util.client import Client as UtilClient

# Supported regions and their DMS endpoints
endpoints = {
    "cn-beijing": "dms.cn-beijing.aliyuncs.com",
    "cn-hangzhou": "dms.cn-hangzhou.aliyuncs.com",
    "cn-shanghai": "dms.cn-shanghai.aliyuncs.com",
    "cn-shenzhen": "dms.cn-shenzhen.aliyuncs.com",
    "ap-southeast-1": "dms.ap-southeast-1.aliyuncs.com",
}


class DmsAirflowRestApi:
    def __init__(self, endpoint: str):
        # Initialize the client using credentials from environment variables
        credential = CredentialClient()
        config = open_api_models.Config(credential=credential)
        config.endpoint = endpoint  # See https://api.aliyun.com/product/Dms for all endpoints
        self.client = Dms20250414Client(config)

    def get_login_token(self, airflow_id: str) -> CreateAirflowLoginTokenResponseBodyData:
        # Call CreateAirflowLoginToken to get a login token for the Airflow instance
        request = dms_20250414_models.CreateAirflowLoginTokenRequest(airflow_id=airflow_id)
        runtime = util_models.RuntimeOptions()
        try:
            response = self.client.create_airflow_login_token_with_options(request, runtime)
            return response.body.data  # Contains the token and host URL
        except Exception as error:
            print(error.message)
            print(error.data.get("Recommend"))
            UtilClient.assert_as_string(error.message)

    def get_session_cookie(self, login_token: CreateAirflowLoginTokenResponseBodyData):
        # Log in to the Airflow instance and extract the session cookie
        login_url = f'{login_token.host}/login?token={login_token.token}'
        try:
            response = requests.get(login_url)
            response.raise_for_status()
            if 'session' in response.cookies:
                return response.cookies['session']  # Use this cookie in subsequent API calls
            return None
        except Exception as e:
            print(f"Error: {e}")
            return None

    def list_dags(self, login_token: CreateAirflowLoginTokenResponseBodyData, session_cookie: str):
        # List all DAGs in the Airflow instance
        url = f'{login_token.host}/api/v1/dags'
        try:
            response = requests.get(url, cookies={'session': session_cookie})
            response.raise_for_status()
            print(response.json())
        except Exception as e:
            print(f"Error: {e}")


if __name__ == '__main__':
    region = sys.argv[1]      # e.g., cn-hangzhou
    airflow_id = sys.argv[2]  # e.g., af-sxsssxx
    endpoint = endpoints.get(region)
    api = DmsAirflowRestApi(endpoint)
    login_token = api.get_login_token(airflow_id)
    session_cookie = api.get_session_cookie(login_token)
    api.list_dags(login_token, session_cookie)
```

Run the script:

```shell
python3 dms_rest_api.py <region-id> <airflow-instance-id>
```

For example:

```shell
python3 dms_rest_api.py cn-hangzhou af-sxsssxx
```

To get your Airflow instance ID, call the CreateAirflowLoginToken API operation and check the response.
Supported CLI commands
The following commands are supported. For usage details and parameters, see the Apache Airflow CLI reference.
DAG management
```shell
dags backfill            # Backfill DAG runs for a date range
dags delete              # Delete a DAG
dags list                # List all DAGs
dags list-import-errors  # Show DAG import errors
dags list-jobs           # List jobs for a DAG
dags list-runs           # List DAG runs
dags next-execution      # Show the next execution time
dags pause               # Pause a DAG
dags report              # Show DAG report
dags reserialize         # Force re-serialization of all DAGs
dags show                # Display a DAG's structure
dags state               # Get the state of a DAG run
dags test                # Test a DAG run without writing to the database
dags trigger             # Trigger a DAG run
dags unpause             # Unpause a DAG
```

Task management
```shell
tasks clear               # Clear task instances
tasks failed-deps         # List dependencies blocking a task from running
tasks list                # List tasks in a DAG
tasks render              # Render a task's template fields
tasks state               # Get the state of a task instance
tasks states-for-dag-run  # Get task states for a specific DAG run
tasks test                # Test a task instance without writing to the database
```

Connection management
```shell
connections add     # Add a connection
connections delete  # Delete a connection
```

Variable management
```shell
variables delete  # Delete a variable
variables get     # Get a variable value
variables list    # List all variables
variables set     # Set a variable value
```

Provider management
```shell
providers behaviours     # List registered provider behaviours
providers get            # Get provider details
providers hooks          # List provider hooks
providers list           # List all installed providers
providers links          # List provider links
providers notifications  # List provider notification classes
providers secrets        # List provider secret backends
providers triggers       # List provider triggers
providers widgets        # List provider widgets
```

Utilities
```shell
cheat-sheet  # Display a cheat sheet of common commands
db clean     # Clean old DAG run records from the database
version      # Print the Airflow version
```
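When one of these commands takes arguments (for example, dags list-runs -d <dag-id>), the value of the command query parameter sent to the /api/v1/command endpoint should be URL-encoded so that spaces and flags survive the query string. A sketch using Python's standard urllib; the instance URL parts and the DAG ID example_dag are placeholders, and the exact encoding the endpoint accepts is an assumption worth verifying against your instance:

```python
from urllib.parse import urlencode

# Placeholder instance URL; substitute your real tenant, workspace, and instance IDs
base = ("https://data-cn-beijing-dms.aliyuncs.com/airflow/"
        "<tenant-id>/<workspace-id>/<airflow-instance-id>")

# URL-encode the full CLI command string as the 'command' query parameter
command = "dags list-runs -d example_dag"
url = f"{base}/api/v1/command?{urlencode({'command': command})}"
print(url)
```

You would then POST to this URL with the session cookie, exactly as in the version example above.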