Data Management: Airflow DMS Operator

Last Updated: Jul 23, 2025

Data Management (DMS) provides several custom Airflow operators that help you securely use the resources managed by DMS. You can choose operators based on your requirements.

Prerequisites

Procedure

Note: For more information, see Workflow development.

  1. Access the WORKSPACE or REPOS code page.

  2. Configure the code in the Python file. Choose one of the following operators based on the resource that you want to access; a minimal DAG sketch is provided after this procedure.

    DMSSqlOperator: Submits SQL statements to a database instance managed by DMS for execution and retrieves the results.

    DTSLakeInjectionOperator: Uses Data Transmission Service (DTS) to synchronize data from a database managed by DMS to Object Storage Service (OSS).

    DMSNotebookOperator: Runs a notebook file (.ipynb) managed by DMS.

    DMSAnalyticDBSparkSqlOperator: Submits Spark SQL statements to a specific resource group of an AnalyticDB for MySQL Data Lakehouse Edition cluster managed by DMS. The resource group must be of the Interactive type and use the Spark engine.

  3. Run the Python file in the Airflow web interface or by calling the Airflow REST API, as shown in the second sketch after this procedure.
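
The following sketch shows what a minimal DAG that uses DMSSqlOperator might look like. The import path and the parameter names (database_id, sql) are assumptions made for illustration only and are not taken from this documentation; check the operator definitions available in your DMS workspace for the exact module path and signature.

    # Minimal DAG sketch. The import path below and the DMSSqlOperator
    # parameters (database_id, sql) are assumed names for illustration;
    # verify them against the operators provided in your DMS workspace.
    from datetime import datetime

    from airflow import DAG
    from dms_airflow.operators import DMSSqlOperator  # hypothetical import path

    with DAG(
        dag_id="dms_sql_example",
        start_date=datetime(2025, 1, 1),
        schedule=None,   # run only when triggered manually or via the REST API
        catchup=False,
    ) as dag:
        run_query = DMSSqlOperator(
            task_id="run_query",
            database_id="<your-dms-database-id>",  # assumed parameter: DMS database instance ID
            sql="SELECT COUNT(*) FROM orders;",    # assumed parameter: SQL to submit
        )

Save the file to the WORKSPACE or REPOS code directory described in step 1 so that the Airflow scheduler can pick it up.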
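
As a sketch of the REST API option in step 3, the following call triggers a run of the DAG defined above through the Airflow stable REST API (POST /api/v1/dags/{dag_id}/dagRuns). The base URL and the basic-auth credentials are placeholders, and the authentication method depends on how your Airflow environment is configured.

    # Trigger a run of the dms_sql_example DAG through the Airflow stable REST API.
    # The base URL and credentials are placeholders; the authentication scheme
    # depends on your Airflow deployment.
    import requests

    AIRFLOW_BASE_URL = "http://<your-airflow-host>:8080"  # placeholder
    DAG_ID = "dms_sql_example"

    response = requests.post(
        f"{AIRFLOW_BASE_URL}/api/v1/dags/{DAG_ID}/dagRuns",
        json={"conf": {}},                  # optional run configuration
        auth=("<username>", "<password>"),  # placeholder basic-auth credentials
        timeout=30,
    )
    response.raise_for_status()
    print(response.json()["dag_run_id"])    # ID of the newly created DAG run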