Data Management (DMS) provides several custom Airflow operators that let you securely access resources managed by DMS. Choose an operator based on your requirements.
Prerequisites

For more information, see Workflow development.

Procedure

Access the WORKSPACE or REPOS code page.

Configure the code in the Python file.
The following operators are available:

DMSSqlOperator: Submits SQL statements to a database instance managed by DMS and retrieves the execution results.

DTSLakeInjectionOperator: Uses Data Transmission Service (DTS) to synchronize data from a DMS-managed database to Object Storage Service (OSS).

DMSNotebookOperator: Runs a notebook file (.ipynb) managed by DMS.

DMSAnalyticDBSparkSqlOperator: Submits Spark SQL to a specified resource group in an AnalyticDB for MySQL Data Lakehouse Edition cluster (task type Interactive, Spark engine).
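The "Configure the code" step amounts to writing an ordinary Airflow DAG file that instantiates one of the operators above. The snippet below is a minimal sketch only: the import path (`dms_provider.operators`) and the operator parameters (`database_id`, `sql`) are assumptions for illustration, not the actual DMS package API; check the DMS operator reference for the real names.

```python
# Hypothetical DAG sketch. The import path and the DMSSqlOperator
# arguments are assumed for illustration; consult the DMS operator
# documentation for the real package name and parameters.
from datetime import datetime

from airflow import DAG
from dms_provider.operators import DMSSqlOperator  # hypothetical import path

with DAG(
    dag_id="dms_sql_demo",
    start_date=datetime(2024, 1, 1),
    schedule=None,   # trigger manually from the Airflow UI or REST API
    catchup=False,
) as dag:
    run_sql = DMSSqlOperator(
        task_id="query_orders",
        database_id="my-dms-database",      # assumed parameter: a DMS-managed instance
        sql="SELECT COUNT(*) FROM orders;",
    )
```

Saving a file like this in the DAG folder makes the task appear in the Airflow UI, where it can be triggered and its query results inspected in the task logs.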
Run the Python file in the Airflow UI or by calling the Airflow REST API.
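For the REST API route, Airflow's stable API exposes a `POST /api/v1/dags/{dag_id}/dagRuns` endpoint for triggering a DAG run. The helper below is a sketch using only the standard library; the base URL, DAG ID, and credentials are placeholders you would replace with your own deployment's values.

```python
import base64
import json
from urllib import request

def build_trigger_request(base_url: str, dag_id: str, user: str,
                          password: str, conf: dict) -> request.Request:
    """Build a POST request that triggers a DAG run via Airflow's stable REST API."""
    url = f"{base_url}/api/v1/dags/{dag_id}/dagRuns"
    body = json.dumps({"conf": conf}).encode("utf-8")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",  # basic auth; your deployment may use another scheme
        },
    )

# Placeholder values; substitute your Airflow endpoint, DAG ID, and credentials.
req = build_trigger_request("http://localhost:8080", "dms_sql_demo",
                            "admin", "admin", {"run_date": "2024-01-01"})
# request.urlopen(req) would actually submit the run; omitted here.
```

The `conf` dictionary is passed through to the DAG run and can be read from the task context, which is useful for parameterizing the SQL a run executes.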