
Data Management: Set up and manage Airflow environments

Last Updated: Dec 05, 2025

This topic describes how to create an Airflow instance, add a linked account, and create a code repository.

Prerequisites

Ensure that the required instance resources are prepared as described in Preparations.

Billing

The fees for an Airflow instance are based on its Workflow Specifications (the number of CUs used). The unit price is displayed on the Configure Resources page.

Step 1: Create an Airflow instance

  1. Log on to the DMS console V5.0.
  2. Go to the Workspace page.

    In the upper-left corner of the console, click the menu icon and choose All Features > Data+AI > Workspace.

    Note

    If you are not using the simplified console, choose Data+AI > Workspace from the top menu bar.

  3. Click the name of the target workspace to open it, or create a new workspace.

  4. In the navigation pane on the left of the workspace, choose Airflow Instance, and then click Create Instance.

  5. Configure the instance.

    The following list describes the important parameters.

    • Workflow Specification: Select a specification based on the scale and complexity of your workflows. For more information, see Airflow specifications.

    • Worker Node Extension: Airflow automatically adjusts the number of worker nodes based on the task load. The number of worker nodes can range from 1 to 10.

    • VPC ID: No changes are needed. By default, this is the same as the VPC of the workspace.

    • vSwitch: Select the target vSwitch.

    • Security Group: Select the security group that is used to control access for the workflow.

    • OSS Bucket: Select an Object Storage Service (OSS) bucket in the same region as the workspace.

    • OSS Path: Enter the data storage path that you created during the preparations.

  6. Click Submit.

    The resources are deployed when the instance status changes to Running.

Step 2: Add a linked account

Note

Linked accounts are independent. Other users in the same workspace cannot view the resources associated with your linked account.

  1. In the upper-right corner of the workspace, click your profile picture, and then click the add icon to create a linked account.

  2. In the New Service Provider Account dialog box, select an account Type.

    DMS currently supports three account types: GitHub, Apsara Devops Codeup, and Private GitLab.

  3. Select a Creation Method and enter the Username and Password or the Access Token.

    You can import an account by using a username and password or by using an access token.

  4. Click OK.

Step 3: Create a code repository

  1. In the navigation pane on the left of the workspace page, click the EXPLORER icon to go to the EXPLORER page.

  2. In the CODE (Code Repository) area, click the add icon and select Add existing git repository.

  3. Enter a Project Name, select the corresponding Git Provider, specify the Git Repository URL, and then click OK.

    If you use Alibaba Cloud services, set Git Provider to Codeup. DMS then selects a Codeup linked account by default.

    After the repository is created, its name appears in the repository list.

Step 4: Develop code

  1. To the right of the target code repository name, click the branch name, which is master by default. You can then switch branches, create branches, edit code, or save your code. A minimal example of the kind of DAG file you might develop is shown at the end of this step.

    Note

    The save operation is equivalent to the 'git push' command.

  2. Confirm the environment.

    Hover the mouse pointer over the repository name, click the configuration icon, and then confirm the environment and parameter configurations.

  3. Hover the mouse pointer over the repository name and click the deploy button.

  4. In the dialog box that appears, click OK to deploy the code.

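For reference, the following is a minimal sketch of a DAG file of the kind you might develop and push to the repository. The DAG ID, schedule, and task logic are illustrative placeholders rather than values required by DMS, and the sketch assumes Airflow 2.4 or later.

    # example_dag.py: a minimal, illustrative DAG definition (not a DMS requirement).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def say_hello():
        # Placeholder task logic; replace this with your own business logic.
        print("Hello from the managed Airflow environment")


    with DAG(
        dag_id="example_hello_dag",   # appears in the DAG list after deployment
        start_date=datetime(2025, 1, 1),
        schedule="@daily",            # requires Airflow 2.4 or later; older versions use schedule_interval
        catchup=False,                # do not backfill past runs
    ) as dag:
        PythonOperator(
            task_id="say_hello",
            python_callable=say_hello,
        )

After the code is deployed, the DAG appears in the Airflow space, as described in the next step.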

Step 5: View published tasks in the Airflow space

  1. In the navigation pane on the left of the workspace page, click the Airflow icon to open the Airflow space.

  2. Click the Airflow instance under the target repository to view the published tasks.

    You can click a Directed Acyclic Graph (DAG) name to view the execution results.

Appendix: Airflow specifications

Note

Both PostgreSQL and Redis are high-availability (HA) instances.

Small

  • Web Servers: 1 vCPU, 4 GB RAM each, 2 replicas
  • Workers: 1 vCPU, 4 GB RAM each, 1 replica
  • Schedulers: 1 vCPU, 4 GB RAM each, 2 replicas
  • PostgreSQL: 2 vCPU, 4 GB RAM
  • Redis: 1 GB
  • A maximum of 50 DAGs (task flows) is recommended.
  • Each worker has a default degree of parallelism of 5.

Medium

  • Web Servers: 1 vCPU, 4 GB RAM each
  • Workers: 2 vCPU, 8 GB RAM each
  • Schedulers: 2 vCPU, 8 GB RAM each
  • PostgreSQL: 2 vCPU, 8 GB RAM
  • Redis: 2 GB
  • A maximum of 250 DAGs is recommended.
  • Each worker has a default degree of parallelism of 10.

Large

  • Web Servers: 2 vCPU, 8 GB RAM each
  • Workers: 4 vCPU, 16 GB RAM each
  • Schedulers: 4 vCPU, 16 GB RAM each
  • PostgreSQL: 2 vCPU, 8 GB RAM
  • Redis: 4 GB
  • A maximum of 100 DAGs is recommended.
  • Each worker has a default degree of parallelism of 20.

Extra Large

  • Web Servers: 4 vCPU, 16 GB RAM each
  • Workers: 8 vCPU, 32 GB RAM each
  • Schedulers: 8 vCPU, 32 GB RAM each
  • PostgreSQL: 4 vCPU, 32 GB RAM
  • Redis: 8 GB
  • A maximum of 2,000 DAGs is recommended.
  • Each worker has a default degree of parallelism of 40.

2XL

  • Web Servers: 8 vCPU, 32 GB RAM each
  • Workers: 16 vCPU, 64 GB RAM each
  • Schedulers: 16 vCPU, 64 GB RAM each
  • PostgreSQL: 8 vCPU, 64 GB RAM
  • Redis: 16 GB
  • A maximum of 4,000 DAGs is recommended.
  • Each worker has a default degree of parallelism of 80.
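
The degree of parallelism listed above is the number of tasks that a single worker can run at the same time. If you do not want one DAG to occupy all worker slots, you can cap its concurrency in the DAG definition. The following is a minimal sketch that assumes Airflow 2.4 or later; the values are illustrative, not DMS defaults.

    # Sketch: capping per-DAG concurrency so one DAG cannot use every worker slot.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator

    with DAG(
        dag_id="capped_concurrency_dag",
        start_date=datetime(2025, 1, 1),
        schedule=None,           # run only when triggered manually
        catchup=False,
        max_active_tasks=3,      # at most 3 tasks of this DAG run at the same time
        max_active_runs=1,       # at most 1 run of this DAG at a time
    ) as dag:
        # Ten independent placeholder tasks; at most 3 execute concurrently.
        for i in range(10):
            EmptyOperator(task_id=f"step_{i}")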