
Use AnalyticDB for MySQL and DMS to generate reports on a regular basis

Last Updated: Jul 27, 2020

  AnalyticDB for MySQL is a popular real-time data warehousing service that can process petabytes of data with high concurrency. You can use it to analyze your enterprise data in an efficient, easy, and agile manner. However, operations are still cumbersome when you use AnalyticDB for MySQL for recurring, complex data analysis, such as generating daily or hourly data reports. To resolve this issue, you can use Data Management Service (DMS) as a supplement. This topic describes how to use the task orchestration feature of DMS to create and schedule task flows that regularly analyze data and generate data reports in AnalyticDB for MySQL, so that applications can obtain the analysis results in a timely manner. This method has the following benefits:

  • You need to define a task flow only once. Tasks in the task flow can be run automatically and periodically. This greatly reduces the cost of manual operations.
  • Tasks are run periodically and analysis results are generated in real time. Business applications can query the analysis results in a timely manner.
  • You can specify the time to run a task flow as required. This avoids heavy loads on AnalyticDB for MySQL and makes full use of its computing resources.
  • The results generated when task flows are being run can be reused by other data analysis tasks. This maximizes the resource utilization of AnalyticDB for MySQL.
  • DMS supports data migration across various types of databases, such as AnalyticDB for MySQL databases and online transaction processing (OLTP) databases. This allows you to import historical data to AnalyticDB for MySQL and return the analysis results to online business databases.



Prerequisites

  • DMS is purchased. An AnalyticDB for MySQL instance is purchased, and an AnalyticDB for MySQL database is registered in DMS. For more information, see Instance management.
  • The business database that stores data to be analyzed by AnalyticDB for MySQL is registered in DMS.
  • You have the required permissions on the registered AnalyticDB for MySQL database and the business database. For example, you must have the change permission on the AnalyticDB for MySQL database to import data to the database. For more information, see Permission management.

Create a task flow for data development

This section describes how to create and configure a task flow for generating AnalyticDB for MySQL reports on the Task orchestration page of DMS.

Import business data

You can use one of the following methods to import business data to the AnalyticDB for MySQL database:

  • Method 1: Use the cross-database query feature of DMS

  DMS supports data migration so that you can migrate data from one or more source databases to the destination database. To create a Cross Database SQL task in a task flow to import data from the business database to the AnalyticDB for MySQL database, follow these steps:

  1. Log on to the DMS console. Create database links for the source business database and the destination AnalyticDB for MySQL database. For more information, see Cross-database query. In this example, create a database link named dblink_source for the source business database, and a database link named dblink_adb for the destination AnalyticDB for MySQL database.

  2. Go to the Task orchestration page. Create a task flow named daily_report and create a Cross Database SQL task named data_import in the task flow.

  3. Write the following SQL statement on the Content tab of the data_import task. Because DMS supports cross-database query, a single INSERT INTO ... SELECT statement is enough to migrate data from the source dblink_source database to the destination dblink_adb database. In the following statement, ${bizdate} is a system variable that specifies the data timestamp of the recurring task. By default, the data timestamp is one day before the day when the task is run. For more information, see Task. The dt field specifies the date when the data is generated. Based on this field, you can migrate incremental data by date.

   insert into dblink_adb.schema2.table2 (dt, column1, column2, column3, xxx)
   select dt, column1, column2, column3, xxx
   from dblink_source.schema1.table1
   where dt = '${bizdate}';
  • Method 2: Use the data synchronization feature of DTS

  Data Transmission Service (DTS) allows you to configure a data synchronization task for importing data from the source business database to the destination AnalyticDB for MySQL database in real time. You need to log on to the DTS console to configure the task. For more information, see DTS documentation. After you configure the data synchronization task in DTS, data in the source database is written to the destination database in real time. In this case, when you use a DMS task flow to generate reports, you do not need to create a Cross Database SQL task to import data.

Create tasks for generating reports

  After you complete the configuration for data import, you can create tasks for generating reports. Create a Single Instance SQL task in the target DMS task flow. On the Content tab of the task, select the destination AnalyticDB for MySQL database from the Database drop-down list so that AnalyticDB for MySQL analyzes the business data for the task. Because the SQL statements are executed on AnalyticDB for MySQL, they must follow the SQL syntax of AnalyticDB for MySQL.
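  As a sketch of such a report task, the following statement aggregates one day of data into a report table. The table and column names (orders, daily_order_report, and so on) are hypothetical placeholders for illustration; replace them with your own schema. The ${bizdate} variable is the same scheduling variable described earlier.

```sql
-- Hypothetical example: aggregate one day of order data into a report table.
-- Table and column names are placeholders; adapt them to your schema.
insert into daily_order_report (dt, city, order_count, total_amount)
select dt,
       city,
       count(*)    as order_count,
       sum(amount) as total_amount
from orders
where dt = '${bizdate}'
group by dt, city;
```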

  If the logic for generating reports is complex, you can implement it in multiple steps. In this case, create a task for each step and arrange the tasks based on the running order, as shown in the following figure. This method is common in data warehouse development. It simplifies the SQL statements of each task to minimize SQL syntax errors. With reasonable overall planning, the intermediate data of each task can be reused in collaborative data development that involves multiple users and tasks. This avoids repeated computing and saves resources.
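  As one possible way to split the logic, the statements below sketch a two-task flow: the first task materializes a cleaned intermediate table, and the second aggregates it into the final report. All table names are hypothetical. Each statement would go on the Content tab of its own Single Instance SQL task, with the two tasks connected in running order.

```sql
-- Task 1 (hypothetical): deduplicate raw data into an intermediate table.
insert into orders_cleaned (dt, order_id, city, amount)
select dt, order_id, city, max(amount)
from orders_raw
where dt = '${bizdate}'
group by dt, order_id, city;

-- Task 2 (hypothetical): aggregate the intermediate table into the final report.
insert into daily_city_report (dt, city, order_count, total_amount)
select dt, city, count(*), sum(amount)
from orders_cleaned
where dt = '${bizdate}'
group by dt, city;
```

  The intermediate table orders_cleaned can then also be reused by other tasks in the flow, which avoids repeated computing as described above.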

Run the task flow

Set parameters for scheduling the task flow

  After you create the task flow, enable scheduling and set scheduling parameters for the task flow. For example, you can select the interval for scheduling the task flow as needed, such as Day. You can also specify the effective period for the schedule. We recommend that you run the task flow during off-peak hours to balance the load of the AnalyticDB for MySQL instance and make full use of the computing resources in the instance.

  After you configure the task flow, you can test the task flow by using the try run feature. If the try run result is not as expected, you can modify the task flow as needed.

View the running status of the task flow

  After you enable scheduling for the task flow, the system runs the task flow to generate reports as configured. To learn about the running status of the task flow, go to the Operation Center tab. On this tab, you can also filter failed task flows for troubleshooting.


Summary

Data warehouses are applicable to multiple scenarios, including the recurring report generation described in this topic. AnalyticDB for MySQL is a data warehouse engine with powerful storage and computing capabilities, and DMS provides the data development feature that connects AnalyticDB for MySQL databases with other types of databases. In this way, AnalyticDB for MySQL is no longer a data silo: you can import data from other databases into AnalyticDB for MySQL for analysis, which makes data management in enterprises more flexible, unified, and efficient. With the data development feature of DMS, you can use AnalyticDB for MySQL to extract value from data in an easier and more efficient manner, driving business development and creating more opportunities.