
Set library migration

Last Updated: Apr 04, 2018

Full-database migration is a convenient tool that improves efficiency and reduces cost by quickly uploading all the tables in a MySQL database to MaxCompute at once. For more information, see Full-database migration overview.

This article demonstrates how to migrate a full MySQL database to MaxCompute by using the full-database migration feature.


  1. Log on to the DTplus Data Integration console and click Offline Sync > Data Sources in the left-side navigation pane to go to the data source management page, as shown in the following figure.


  2. Click New Source in the upper right corner to add a MySQL data source named clone_database for full-database migration, as shown in the following figure.


  3. Click Test Connectivity to verify that the data source can be accessed correctly, and click Complete to save the data source.

  4. After the data source is added, the new MySQL data source clone_database is displayed in the data source list. Click Migration next to the data source to go to the full-database migration page, as shown in the following figure.


    The full-database migration interface consists of three functional areas, as shown in the following figure.


    • Table filter area: lists all the tables under the MySQL data source clone_database, so that you can select the tables to be migrated in batches.

    • Advanced Settings: provides the conversion rules for table names, column names, and column types between MySQL and MaxCompute tables.

    • Migration mode and concurrency control area: you can select the migration mode (full or incremental) and the concurrency (batch upload or full upload), and check the progress of the submitted migration tasks.

  5. Click Advanced Settings and select conversion rules as needed. For example, you can add the ods_ prefix to all the tables created in MaxCompute, as shown in the following figure.
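The effect of the prefix rule above can be sketched as a simple name mapping. This is an illustrative example only; the function name and signature are hypothetical, not part of the Data Integration product.

```python
# Hypothetical sketch of the table-name conversion rule: each MaxCompute
# table is named by prefixing the source MySQL table name with "ods_".
def to_maxcompute_name(mysql_table: str, prefix: str = "ods_") -> str:
    """Apply the configured prefix rule to a source table name."""
    return f"{prefix}{mysql_table}"

print(to_maxcompute_name("a1"))  # ods_a1
```

For example, the source table a1 becomes ods_a1 in MaxCompute, matching the result shown in step 7.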


  6. In the migration mode and concurrency control area, select Daily Incremental as the synchronization mode and set gmt_modified as the incremental field. By default, Data Integration generates a WHERE condition for incremental extraction for each task based on the selected incremental field, and defines a daily extraction window by using a DataWorks scheduling parameter such as ${bdp.system.bizdate}, as shown in the following figure.


    Data Integration extracts table data by connecting to the remote MySQL database through JDBC and running SELECT statements. Because these are standard SQL statements, you can set a WHERE clause to control the data range. In this case, the WHERE condition for incremental extraction is as follows:

    STR_TO_DATE('${bdp.system.bizdate}', '%Y%m%d') <= gmt_modified AND gmt_modified < DATE_ADD(STR_TO_DATE('${bdp.system.bizdate}', '%Y%m%d'), interval 1 day)
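At run time, DataWorks replaces ${bdp.system.bizdate} with the business date in yyyymmdd format, which bounds the extraction to one day of data. A minimal sketch of that substitution (the template string is from the condition above; the helper function is hypothetical):

```python
# The incremental WHERE condition, with the scheduling parameter left
# as a placeholder exactly as it appears in the task configuration.
WHERE_TEMPLATE = (
    "STR_TO_DATE('${bdp.system.bizdate}', '%Y%m%d') <= gmt_modified "
    "AND gmt_modified < DATE_ADD(STR_TO_DATE('${bdp.system.bizdate}', "
    "'%Y%m%d'), interval 1 day)"
)

def render_where(bizdate: str) -> str:
    """Substitute the business date (yyyymmdd) into the WHERE template,
    as the scheduler does before the SELECT statement runs."""
    return WHERE_TEMPLATE.replace("${bdp.system.bizdate}", bizdate)

print(render_where("20180404"))
```

The rendered condition selects only rows whose gmt_modified falls on the business date, so each daily run extracts exactly one day of changes.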

    To protect the MySQL data source from being overloaded by too many synchronization jobs starting at the same time, you can select Batch Upload. For example, you can configure the migration to start synchronizing three tables every hour from 00:00 every day. Finally, click Submit Task, and you can view the migration progress and the status of the migration task for each table.
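The Batch Upload schedule described above can be sketched as follows. This is an illustrative model of the timing, not product code; the function and its parameters are hypothetical.

```python
# Hypothetical sketch of the Batch Upload schedule: starting at 00:00, a
# new batch of three tables begins every hour, so n tables are spread
# over ceil(n / batch_size) hourly slots instead of all starting at once.
def batch_schedule(tables, batch_size=3, start_hour=0):
    """Yield (start_time, batch) pairs, one batch per hour."""
    for i in range(0, len(tables), batch_size):
        hour = start_hour + i // batch_size
        yield (f"{hour:02d}:00", tables[i:i + batch_size])

for when, batch in batch_schedule(["a1", "a2", "a3", "b1", "b2"]):
    print(when, batch)
# 00:00 ['a1', 'a2', 'a3']
# 01:00 ['b1', 'b2']
```

Spreading the start times this way caps the number of concurrent JDBC extraction jobs hitting the source database at any moment.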

  7. Click the migration task for table a1 to jump to the task development page of Data Integration.

    The table ods_a1 in MaxCompute, corresponding to the source table a1, has been created successfully, and its column names and types match the previously set conversion rules. In the left-side directory tree, you can view all the full-database migration tasks under clone_database, named in the format mysql2odps_<source table name>.

So far, you have migrated the full MySQL data source clone_database to MaxCompute. The migration tasks run according to the configured scheduling cycle (daily by default). You can also transmit historical data by using the data completing feature of DataWorks. With the full-database migration feature of Data Integration, you can dramatically reduce the configuration effort and cost of the initial migration of data to the cloud. The log of a successful full-database migration run for table a1 is shown as follows: