Use DMS to schedule AnalyticDB for MySQL jobs

The MySQL Event Scheduler works for simple, single-database tasks but lacks observability, cross-database support, and failure recovery. Data Management (DMS) task orchestration runs outside the database kernel, giving you a visual drag-and-drop editor, multi-engine support, scheduled execution, and built-in notifications and O&M controls.

This topic walks through a complete scheduling scenario: filter completed orders above $10,000 from the orders table and write results to a separate table on a daily schedule.

Why use DMS instead of MySQL Event Scheduler

| Limitation | MySQL Event Scheduler | DMS task orchestration |
| --- | --- | --- |
| Configuration | Requires CREATE EVENT and ALTER EVENT syntax; no UI | Visual drag-and-drop editor |
| Scope | Single database only | Multiple engines (MySQL, Oracle, PostgreSQL, SQL Server) and cross-database workflows |
| Observability | No execution history, run duration, or status | Full run history and status in the O&M console |
| Failure handling | Cannot pause, rerun, or recover failed events | Supports pausing, stopping, and rerunning tasks |
| Notifications | None | DingTalk, text message, and email |
| Kernel dependency | Scheduler must be enabled at the kernel level | Independent of database kernel |
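
For context, the kernel-side equivalent of the daily job in this topic would look roughly like the following. This is a sketch in standard MySQL syntax with a hypothetical event name; the Event Scheduler may not be available on AnalyticDB for MySQL at all, which is part of the motivation for using DMS.

-- Sketch: Event Scheduler equivalent of the daily job in this topic.
-- Hypothetical event name; requires the scheduler to be enabled in the kernel.
CREATE EVENT daily_order_cleansing
  ON SCHEDULE EVERY 1 DAY
  STARTS '2023-02-01 01:00:00'
DO
  INSERT INTO finish_orders
  SELECT order_id, total_price, order_date
  FROM orders
  WHERE order_status = 'F';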

Prerequisites

Before you begin, ensure that you have:

  • An AnalyticDB for MySQL cluster (Enterprise Edition, Basic Edition, Data Lakehouse Edition, or Data Warehouse Edition)

  • Access to the AnalyticDB for MySQL console

  • The adb_test database and sample tables created (see Sample data)

Note: The end-to-end data management feature is not supported for AnalyticDB for MySQL clusters in the Indonesia (Jakarta) region.

Sample data

The examples use a database named adb_test with three tables: orders, finish_orders, and large_finish_orders.

CREATE DATABASE adb_test;

CREATE TABLE orders (
  order_id     BIGINT        NOT NULL COMMENT 'Order ID',
  order_status VARCHAR(10)   NOT NULL COMMENT 'Order status',
  total_price  DECIMAL(15,2) NOT NULL COMMENT 'Total price',
  order_date   DATE          NOT NULL COMMENT 'Order date',
  PRIMARY KEY (order_id)
);

CREATE TABLE finish_orders (
  order_id    BIGINT        NOT NULL COMMENT 'Order ID',
  total_price DECIMAL(15,2) NOT NULL COMMENT 'Total price',
  order_date  DATE          NOT NULL COMMENT 'Order date',
  PRIMARY KEY (order_id)
);

CREATE TABLE large_finish_orders (
  order_id    BIGINT        NOT NULL COMMENT 'Order ID',
  total_price DECIMAL(15,2) NOT NULL COMMENT 'Total price',
  order_date  DATE          NOT NULL COMMENT 'Order date',
  PRIMARY KEY (order_id)
);
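
To test the task flow end to end, you can seed orders with a few rows. The values below are hypothetical; status 'F' marks a completed order, matching the filter used later in this topic.

-- Hypothetical sample rows for testing.
INSERT INTO orders (order_id, order_status, total_price, order_date) VALUES
  (1, 'F', 25000.00, '2023-02-01'),
  (2, 'F',  5000.00, '2023-02-01'),
  (3, 'O', 18000.00, '2023-02-02');

With this data, a successful run copies orders 1 and 2 into finish_orders, and only order 1 (completed and above 10,000) into large_finish_orders.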

Schedule a task flow

The workflow has three steps:

  1. Create a task flow — set up the container for your tasks

  2. Add task nodes — configure two SQL tasks with a dependency between them

  3. Schedule and publish — enable periodic scheduling and release the task flow

Step 1: Create a task flow

  1. Log on to the AnalyticDB for MySQL console. In the upper-left corner, select a region. In the left-side navigation pane, click Clusters, then click the target cluster ID.

  2. In the left-side navigation pane, click Job Scheduling.

    If this is your first time accessing an AnalyticDB for MySQL database through the DMS console, you must enter your database logon credentials. For details, see Log on to a database. If you previously logged on to an AnalyticDB for MySQL cluster in Flexible Management or Stable Change mode without selecting Remember Password, a password prompt appears when you open the DMS console. For details about control modes, see Control modes.
  3. In the top navigation bar, move the pointer over the icon and choose All Features.

  4. Choose Data+AI > Data Development > Task Orchestration.

  5. On the Task orchestration page, click Create Task Flow.

  6. In the New Task Flow dialog box, enter a Task Flow Name (for example, Order Filtering) and an optional Description, then click OK.

Step 2: Add task nodes

On the Order Filtering task orchestration page, add two Single Instance SQL nodes and connect them.

Add the Order Cleansing node

This node filters completed orders from orders and inserts them into finish_orders.

  1. In the left-side pane, drag Single Instance SQL onto the canvas.

  2. Select the node and click the rename icon to rename it to Order Cleansing.

  3. Double-click the node (or click the edit icon) to open the editor.

  4. From the database drop-down list, select adb_test.

  5. Enter the following SQL statement and click Save:

    INSERT INTO finish_orders
    SELECT order_id, total_price, order_date
    FROM orders
    WHERE order_status = 'F';

    If Automatic Save is enabled, the SQL statement is saved automatically. A rerun-friendly variant of this statement appears after this procedure.
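
Because the flow runs daily, rerunning this INSERT can conflict with rows copied by earlier runs, since order_id is the primary key. If your cluster supports MySQL-style REPLACE INTO, a rerun-friendly sketch is:

-- Sketch: REPLACE INTO overwrites rows that already exist by primary key.
REPLACE INTO finish_orders
SELECT order_id, total_price, order_date
FROM orders
WHERE order_status = 'F';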

Add the Large Order Generation node

This node filters high-value orders from finish_orders and inserts them into large_finish_orders.

  1. Drag another Single Instance SQL node onto the canvas.

  2. Rename it to Large Order Generation.

  3. Open the editor and select adb_test from the database drop-down list.

  4. Enter the following SQL statement and click Save:

    INSERT INTO large_finish_orders
    SELECT order_id, total_price, order_date
    FROM finish_orders
    WHERE total_price > 10000;

    If Automatic Save is enabled, the SQL statement is saved automatically. An incremental variant of this statement appears after this procedure.
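
On a daily schedule, you often want to process only the previous day's data instead of rescanning the whole table. The following is a sketch of an incremental variant, assuming order_date reflects the business date (DMS also supports time variables in SQL tasks; see its variable documentation for the exact syntax):

-- Sketch: only process yesterday's completed orders.
INSERT INTO large_finish_orders
SELECT order_id, total_price, order_date
FROM finish_orders
WHERE total_price > 10000
  AND order_date = CURRENT_DATE - INTERVAL 1 DAY;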

Connect the two nodes

Hover over the Order Cleansing node. Click the circle on the right edge and drag a line to the Large Order Generation node. This creates a dependency: Large Order Generation runs only after Order Cleansing completes.

Step 3: Schedule and publish

  1. In the Task Flow Information section below the canvas, turn on the Enable Scheduling switch and configure the scheduling parameters.

    This example schedules the task to run daily at 01:00 from February 1, 2023 to February 28, 2023. Adjust the schedule to match your requirements. For all available scheduling parameters, see Configure scheduling.
  2. In the upper-left corner of the canvas, click Publish.

  3. In the Publish dialog box, enter any Remarks and click Publish.

  4. Verify that the task flow is published: click Go to O&M in the upper-right corner and check the Released parameter. For a data-level check, see the queries after this list.

    | Value | Meaning |
    | --- | --- |
    | Published | The task flow is active and will run on schedule |
    | Not published | The task flow was not successfully published |
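
After the first scheduled run, you can also confirm at the SQL level that data moved. Row counts depend on your data; with the hypothetical sample rows above, finish_orders holds two rows and large_finish_orders one.

-- Spot-check the target tables after a run.
SELECT COUNT(*) FROM finish_orders;
SELECT COUNT(*) FROM large_finish_orders;
SELECT * FROM large_finish_orders ORDER BY order_date DESC LIMIT 10;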

Monitor and manage task flows

After publishing, use the DMS O&M console to monitor and manage your task flows.

View run history and status

In the O&M console, you can view the execution history of each task flow. Use this to identify runs that failed or took longer than expected.

Handle task failures

If a task fails, the DMS O&M console provides O&M operations such as pausing, stopping, and rerunning tasks. To rerun or retry a task flow, go to the O&M console, locate the failed run, and select the appropriate action.

Configure notifications

To receive alerts when a task succeeds or fails, configure notifications in the task flow settings. DMS supports the following notification methods:

  • DingTalk

  • Text message

  • Email