
DataWorks:AnalyticDB for MySQL

Last Updated:Mar 26, 2026

AnalyticDB for MySQL nodes in DataWorks let you write and run SQL tasks against an AnalyticDB for MySQL cluster, schedule those tasks automatically, and chain them with other node types in the same workflow.

Prerequisites

Before you begin, ensure that an AnalyticDB for MySQL computing resource is associated with your DataWorks workspace. A node can only use computing resources that are associated with the workspace.

Background Information

AnalyticDB for MySQL is an analytical database service provided by Alibaba Cloud. For more information, see What is AnalyticDB for MySQL?

Step 1: Create an AnalyticDB for MySQL node

  1. Go to the DataStudio page: log on to the DataWorks console, select a region in the top navigation bar, and choose Data Development and O&M > Data Development in the left-side navigation pane. Select the workspace from the drop-down list, then click Go to Data Development.

  2. In the workflow panel, right-click the workflow name and choose Create Node > AnalyticDB for MySQL > ADB for MySQL.

  3. In the Create Node dialog box, enter a Name and click Confirm. The node opens on its configuration tab, ready for SQL development and scheduling configuration.

Step 2: Develop an AnalyticDB for MySQL task

Select a computing resource (optional)

If multiple AnalyticDB for MySQL computing resources are associated with the workspace, select the one to use on the configuration tab before writing task code. If only one computing resource is associated, it is used by default.

Write SQL code

In the code editor on the configuration tab, write the SQL task code. For example:

SHOW TABLES;
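Beyond a simple statement such as the one above, a recurring task typically filters data by a scheduling parameter rather than a hard-coded date. The following is a minimal sketch; the table `sales_orders` and its `ds` column are hypothetical, and it assumes a scheduling parameter named `bizdate` is defined on the Properties tab (see Step 3):

```sql
-- Hypothetical example: sales_orders and its ds (date string) column are
-- placeholders for your own schema.
-- ${bizdate} is replaced at run time with the value assigned to the
-- scheduling parameter on the Properties tab.
SELECT order_id, amount
FROM sales_orders
WHERE ds = '${bizdate}';
```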

Step 3: Configure scheduling properties

To run the node on a recurring schedule, click Properties in the right-side navigation pane on the configuration tab and configure the scheduling settings. See Node scheduling configuration for the full list of options.

Note

Configure the Rerun and Parent Nodes parameters on the Properties tab before committing the task.
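If the task code references scheduling parameters, assign them values on the Properties tab as well. For example, a parameter referenced as ${bizdate} in the SQL code is commonly mapped to the built-in business-date variable in the Parameters field (the name bizdate is a convention, not a requirement; use the names your code references):

```
bizdate=$bizdate
```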

Step 4: Debug the task code

Debugging consists of up to three steps: validating the node configuration (optional), running the SQL and checking the results, and performing smoke testing (optional).

  1. Validate node configuration (optional). Click the Advanced Run icon in the top toolbar. In the Parameters dialog box, select the scheduling resource group to use for the debug run. If your SQL uses scheduling parameters, assign values to them here. See Task debugging process for value assignment rules.

  2. Run the SQL and check results. Click the Save icon to save the node, then click the Run icon to execute the SQL statements. Review the output to confirm that the query returns the expected results.

  3. Run smoke testing (optional). Smoke testing runs the node in the development environment to catch configuration or data issues before the task goes to production. Run smoke testing when you commit the node or immediately after. See Perform smoke testing.

Step 5: Commit and deploy the task

After the task is configured and tested, commit and deploy it so it runs automatically on the schedule you configured.

  1. Click the Save icon to save the task.

  2. Click the Commit icon to commit the task. In the Submit dialog box, enter a Change description.

    Note

    The Rerun and Parent Nodes parameters on the Properties tab must be configured before you can commit. Optionally, enable code review to require approval before the task is deployed. See Code review.

  3. Standard mode workspaces only: After committing, click Deploy in the upper-right corner of the configuration tab to deploy the task to the production environment. See Publish tasks.

What's next

After the task is deployed, it runs automatically on the schedule you configured. To monitor run status and history, click Operation Center in the upper-right corner of the configuration tab. See Manage auto triggered tasks.