AnalyticDB for MySQL allows you to asynchronously submit an import or export task.


When you execute an INSERT INTO or INSERT OVERWRITE INTO SELECT statement to import or export data, the statement is executed synchronously by default. If hundreds of gigabytes of data are involved, the connection between the client and the AnalyticDB for MySQL server may be closed before the statement completes, and the import or export task fails. Therefore, when a large amount of data is involved, we recommend that you submit import and export tasks asynchronously.

Asynchronously submit a task

  • Syntax
    SUBMIT JOB INSERT OVERWRITE INTO table1 SELECT ... FROM table2;

    After you execute the preceding statement, a job ID is returned.

  • Example
    SUBMIT JOB INSERT OVERWRITE INTO test SELECT * FROM test_external_table;
    | job_id                                |
    | 2017112122202917203100908203303000715 |
  • Priority-based scheduling

    AnalyticDB for MySQL V3.1.3.6 and later support priority-based scheduling. You can use a hint to specify the priority of an asynchronous task. The default priority value is 1. A larger value indicates a higher priority, and higher-priority tasks are scheduled first.

    /*+async_job_priority=10*/ SUBMIT JOB INSERT OVERWRITE INTO test SELECT * FROM test_external_table;
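
    The statement and the optional priority hint can also be assembled on the client side before submission. A minimal Python sketch, assuming a helper named build_submit_job (an illustration, not part of any AnalyticDB API):

    ```python
    from typing import Optional

    def build_submit_job(insert_sql: str, priority: Optional[int] = None) -> str:
        """Build a SUBMIT JOB statement for AnalyticDB for MySQL.

        `insert_sql` is the INSERT ... SELECT statement to run asynchronously.
        If `priority` is given, it is emitted as the async_job_priority hint
        (larger values are scheduled first; the server default is 1).
        """
        hint = f"/*+async_job_priority={priority}*/ " if priority is not None else ""
        return f"{hint}SUBMIT JOB {insert_sql}"

    # A priority-10 submission of the export statement shown above.
    stmt = build_submit_job(
        "INSERT OVERWRITE INTO test SELECT * FROM test_external_table;",
        priority=10,
    )
    ```

    The returned string is what you would send over a normal MySQL connection; the server replies with the job ID.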

Query the state of an asynchronous task

  • Syntax
    SHOW JOB STATUS WHERE job='job_id';
  • Example
    SHOW JOB STATUS WHERE job='2017112122202917203100908203303000715';
    The following information is returned:
    | job_id                                | schema_name | status  | fail_msg | create_time           | update_time           | definition                           |
    | 2017112122202917203100908203303000715 | test        | RUNNING | NULL     | 2017-11-21 22:20:31.0 | 2017-11-21 22:20:40.0 |  insert into test select * from test |
  • Task state
    • INIT: The task is being initialized.
    • RUNNING: The task is running.
    • FINISH: The task completed successfully.
    • FAILED: The task failed.
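
    Because the task runs asynchronously, a client typically polls SHOW JOB STATUS until a terminal state is reached. A hedged sketch in Python, where `run_query` is an assumed callable that executes a SQL string and returns the status column (for example, a thin wrapper around a DB cursor), not an AnalyticDB API:

    ```python
    import time

    def wait_for_job(run_query, job_id, poll_interval=5.0, timeout=3600.0):
        """Poll SHOW JOB STATUS until the job reaches FINISH or FAILED.

        `run_query(sql)` must return the task state as a string
        (INIT, RUNNING, FINISH, or FAILED). Raises TimeoutError if the
        job is still not terminal after `timeout` seconds.
        """
        deadline = time.monotonic() + timeout
        sql = f"SHOW JOB STATUS WHERE job='{job_id}';"
        while True:
            status = run_query(sql)
            if status in ("FINISH", "FAILED"):
                return status
            if time.monotonic() >= deadline:
                raise TimeoutError(f"job {job_id} still {status} after {timeout}s")
            time.sleep(poll_interval)

    # Stubbed usage: the states a real deployment would report over time.
    replies = iter(["INIT", "RUNNING", "FINISH"])
    final = wait_for_job(
        lambda sql: next(replies),
        "2017112122202917203100908203303000715",
        poll_interval=0.0,
    )
    ```

    In production, `run_query` would execute the statement over a MySQL connection and read the `status` column of the result row.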

Terminate a task

  • Syntax
    CANCEL JOB 'job_id';
  • Example
    CANCEL JOB '2017112122202917203100908203303000715';
  • Note
    • Tasks that have not been scheduled, have failed, or have finished are removed from the queue.
    • If a running task is terminated, the data that it has already imported is rolled back.
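
Cancellation fits the same client-side pattern. A minimal sketch, where `execute` is an assumed stand-in for a callable that sends a SQL string to AnalyticDB for MySQL (not an AnalyticDB API):

```python
def cancel_job(execute, job_id):
    """Issue CANCEL JOB for the given asynchronous task.

    If the task was still running, the server rolls back the data it had
    imported; tasks that were unscheduled, failed, or finished are simply
    removed from the queue.
    """
    execute(f"CANCEL JOB '{job_id}';")

# Stubbed usage: record the statement instead of hitting a database.
sent = []
cancel_job(sent.append, "2017112122202917203100908203303000715")
```

With a real connection, `execute` would be a cursor's execute method.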