
AnalyticDB for MySQL: Use DMS to schedule XIHE SQL tasks

Last Updated: Sep 12, 2023

AnalyticDB for MySQL Data Warehouse Edition (V3.0) and Data Lakehouse Edition (V3.0) allow you to use the task orchestration feature of Data Management (DMS) to orchestrate, schedule, manage, and monitor AnalyticDB for MySQL tasks. This topic describes how to use DMS to develop and schedule tasks.

Background information

  • Challenges and demands: event scheduling

    Conventional event schedulers such as MySQL Event Scheduler provide powerful scheduling capabilities for database events. However, these event schedulers have the following limits:

    • High technical requirements: You must be familiar with the syntax of the SQL statements that are used to define events, such as CREATE EVENT and ALTER EVENT (see the sample statement after this list). You cannot define events by performing simple configurations on a GUI.

    • Dependency on the database engine: The database engine must support the event scheduler, and the event scheduler must be enabled.

    • Applicability only to a single database: Scheduling can be performed only for a single database. You cannot use a conventional event scheduler to schedule events across databases or collaborate with other tools.

    • Difficulty in tracking: You cannot view the execution status, history, or duration of a scheduled event.

    • Difficulty in O&M: You cannot pause or rerun an event. You cannot restore an event after the event fails to be executed.

    • Inability to send notifications: Conventional event schedulers cannot notify you by text message or email when an event succeeds or fails.

  • Solution: DMS task orchestration

    The task orchestration feature of DMS can resolve these issues. This feature provides an external orchestration and scheduling system that is independent of the event scheduling capabilities of database engines. The task orchestration feature provides the following benefits:

    • Supports multiple types of database engines and database tools. For example, the database engines include MySQL, Oracle, PostgreSQL, and SQL Server. The database tools provide features such as data migration, data backup, and cross-database analysis. This way, you can schedule events across databases and services.

    • Provides a GUI. You can create scheduling tasks by dragging nodes and performing simple configurations on the GUI.

    • Provides a variety of notification channels, such as emails, text messages, and DingTalk notifications.

    • Supports various O&M operations, such as Pause, Exit, and Rerun.
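
    For comparison, the following statement sketches what a conventional event scheduler requires. It is a minimal MySQL CREATE EVENT example; the event name, schedule, and body are illustrative only and are not part of this topic's procedure:

    -- Requires the MySQL event scheduler to be enabled, for example:
    -- SET GLOBAL event_scheduler = ON;
    CREATE EVENT daily_order_cleanup    -- hypothetical event name
      ON SCHEDULE EVERY 1 DAY
      STARTS '2023-02-01 01:00:00'
    DO
      DELETE FROM orders
      WHERE order_date < CURDATE() - INTERVAL 90 DAY;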

Sample data

In this example, a database named adb_test is created. Three tables named orders, finish_orders, and large_finish_orders are created in the database.

create database adb_test;

create table orders(
  order_id bigint not null comment 'order ID',
  order_status varchar not null comment 'order state',
  total_price decimal(15,2) not null comment 'total price',
  order_date date not null comment 'order date',
  PRIMARY KEY (order_id)
);

create table finish_orders(
  order_id bigint not null comment 'order ID',
  total_price decimal(15,2) not null comment 'total price',
  order_date date not null comment 'order date',
  PRIMARY KEY (order_id)
);

create table large_finish_orders(
  order_id bigint not null comment 'order ID',
  total_price decimal(15,2) not null comment 'total price',
  order_date date not null comment 'order date',
  PRIMARY KEY (order_id)
);
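
If you want to follow along, you can load a few rows of test data into the orders table. The rows below are illustrative and are not part of the original example; the value 'F' marks a completed order, which matches the filter condition used later in this topic:

insert into orders values
  (1, 'F', 12000.00, '2023-01-05'),
  (2, 'O', 8000.00, '2023-01-06'),
  (3, 'F', 9500.50, '2023-01-07'),
  (4, 'F', 15000.00, '2023-01-08');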

Task orchestration process

The following steps describe how to schedule a job on an AnalyticDB for MySQL Data Warehouse Edition (V3.0) cluster. The task orchestration feature of DMS is used to filter completed orders whose total price is greater than USD 10,000 from the orders table of an AnalyticDB for MySQL database.

  • Step 1: Create a task flow

    Create a task flow.

  • Step 2: Create a task node

    Create the following task nodes in the created task flow:

    • Order Cleansing: filters completed orders from the orders table by using the order_status = 'F' filter condition and writes the results to the finish_orders table.

    • Large Order Generation: filters orders whose total price is greater than USD 10,000 from the finish_orders table by using the total_price > 10000 filter condition and writes the results to the large_finish_orders table.

  • Step 3: Configure task flow scheduling

    Start the task flow and configure it to run periodically.
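
Taken together, the two nodes are logically equivalent to the following single query. This sketch is only meant to illustrate the data flow; the task flow itself runs the two insert statements shown in Step 2:

select order_id, total_price, order_date
from orders
where order_status = 'F'
  and total_price > 10000;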

Step 1: Create a task flow

  1. Log on to the AnalyticDB for MySQL console.
  2. In the upper-left corner of the page, select a region.
  3. In the left-side navigation pane, click Clusters.
  4. On the Data Warehouse Edition (V3.0) tab, find the cluster that you want to manage and click the Cluster ID.
  5. In the left-side navigation pane, choose End-to-end Data Management > Data Assets.

    Note
    • The first time you log on to an AnalyticDB for MySQL database from the DMS console and use the data asset management feature, you must first configure the database logon information. For more information, see Log on to a database.

    • If you logged on to an AnalyticDB for MySQL database whose control mode is set to Flexible Management or Stable Change but did not select Remember password, you must enter the password of the database account again the next time you log on to the database before you can perform subsequent operations. For more information, see Control modes.

    • The end-to-end data management feature is not supported for the AnalyticDB for MySQL clusters that reside in the Indonesia (Jakarta) region.

  6. In the top navigation bar, click DTS.

  7. In the left-side navigation pane, choose Data Development > Task Orchestration.

  8. Create a task flow. In this example, the task flow name is Order Filtering.

    1. On the Task Orchestration page, click Create Task Flow.

    2. In the Create Task Flow dialog box, set the Task Flow Name and Description parameters, and then click OK.

Step 2: Create a task node

  1. On the Order Filtering tab, create the following task nodes:

    • Order Cleansing

      1. In the left-side navigation pane, choose Data Processing Nodes > Single Instance SQL and drag Single Instance SQL to the right-side canvas.

      2. Select the added task node and click the rename icon to rename the task node Order Cleansing.

      3. Double-click the task node or click the edit icon to edit the task node.

      4. Select a database from the Database drop-down list.

      5. In the code editor, enter the following task statement and click Save:

        insert into finish_orders
        select order_id, total_price, order_date
        from orders
        where order_status = 'F';

        Note

        If you have selected Automatic Save, the SQL statement is automatically saved.

    • Large Order Generation

      1. In the left-side navigation pane, choose Data Processing Nodes > Single Instance SQL and drag Single Instance SQL to the right-side canvas.

      2. Select the added task node and click the rename icon to rename the task node Large Order Generation.

      3. Double-click the task node or click the edit icon to edit the task node.

      4. Select a database from the Database drop-down list.

      5. In the code editor, enter the following task statement and click Save:

        insert into large_finish_orders
        select order_id, total_price, order_date
        from finish_orders
        where total_price > 10000;

        Note

        If you have selected Automatic Save, the SQL statement is automatically saved.

  2. On the task flow canvas, move the pointer over the Order Cleansing node, click and hold the small circle on the right side of the node, and then draw a line from the circle to the Large Order Generation node. This way, the two task nodes are connected to form a task flow. You can verify the output of each node by using the queries sketched below.

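After the task flow runs, you can check the destination tables to confirm that each node produced the expected results. These verification queries are illustrative and are not part of the original procedure:

-- Rows written by the Order Cleansing node
select count(*) from finish_orders;

-- Rows written by the Large Order Generation node
select * from large_finish_orders limit 10;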

Step 3: Configure task flow scheduling

  1. In the Scheduling Settings section below the task flow canvas, turn on Enable Scheduling and configure related parameters.

    Note

    In this example, the scheduling task is set to run on a periodic basis at 01:00 every day from 2023-02-01 to 2023-02-28. You can also customize the scheduling task based on your business needs. For more information about scheduling parameters, see the "Configure scheduling properties for the task flow" section of the Configure a task flow topic. A cron-style expression equivalent to this schedule is sketched at the end of this topic.

  2. Publish the task flow.
    1. In the upper-left corner of the canvas, click Publish.
    2. In the Publish dialog box, set the Remarks parameter and click Publish.
  3. View the status of the task flow.
    1. In the upper-right corner of the canvas, click Go to O&M.
    2. On the right side of the page, check whether the task flow is published.
      • Published: The task flow is published.
      • Not published: The task flow is not published.
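
For reference, the daily 01:00 schedule used in this example corresponds to a cron-style expression. Assuming the Scheduling Settings section displays a Quartz-style cron expression (this format is an assumption, not taken from the original procedure), the expression would be:

0 0 1 * * ?    -- second 0, minute 0, hour 1: runs at 01:00 every day

The 2023-02-01 to 2023-02-28 window is controlled by the effective date range in the scheduling settings rather than by the cron expression itself.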