Data Management (DMS) allows you to quickly import large amounts of data into a database. This helps you save labor and resources.

Supported database types

  • RDS: ApsaraDB RDS for MySQL, ApsaraDB RDS for MariaDB, ApsaraDB RDS for SQL Server, ApsaraDB RDS for PostgreSQL, and ApsaraDB RDS for PPAS
  • PolarDB: PolarDB for MySQL, PolarDB for PostgreSQL, and PolarDB-O
  • PolarDB-X
  • OceanBase
  • AnalyticDB: AnalyticDB for MySQL and AnalyticDB for PostgreSQL
  • Self-managed databases: MySQL, SQL Server, and PostgreSQL databases
Note
  • You can use the INSERT INTO, INSERT IGNORE, or REPLACE INTO statement to write data to the MySQL, PolarDB for MySQL, OceanBase, AnalyticDB for MySQL, and PolarDB-X databases.
  • You can use only the INSERT INTO statement to write data to other databases.
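  For reference, the following statements show the three write forms. The table employee and its columns are hypothetical and are used only for illustration:
    INSERT INTO employee (id, name, age) VALUES (1, 'Alice', 30);        -- fails if a row with id 1 already exists
    INSERT IGNORE INTO employee (id, name, age) VALUES (1, 'Alice', 30); -- skips the row if a row with id 1 already exists
    REPLACE INTO employee (id, name, age) VALUES (1, 'Alice', 30);       -- deletes the conflicting row and then inserts the new row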

Background information

  • Scenarios
    • Initialize application data.
    • Import data in emergency scenarios. For example, you can import backup data to roll back the changes caused by an accidental operation.
  • Supported file formats
    • TXT files. Each TXT file for importing data can be up to 5 GB in size.
    • SQL scripts. Each SQL script for importing data can be up to 5 GB in size. By default, you can use only INSERT and REPLACE statements to import data to database instances that are managed in Security Collaboration mode. If you want to use other SQL statements to import data, modify the security rules for data import as a database administrator (DBA) or DMS administrator. To modify the security rules, click the SQL Correct tab on the Security Rules tab and set the Checkpoints parameter to Batch Data import rules.
    • CSV files. Values in a CSV file must be separated by commas (,), and the first row must contain the field names.
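    For example, a minimal CSV file that meets these requirements may look like the following. The field names and values are hypothetical. For complete examples, see Sample files.
      id,name,age
      1,Alice,30
      2,Bob,25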

Usage notes

  • If you need to use SQL statements to import only a small amount of data, we recommend that you submit a Normal Data Modify or Lockless change ticket.
  • If you submit a Large Data Import ticket to import a large amount of data to a table, the table will be locked even if you set the OnlineDDL parameter to Open (DMS OnlineDDL first) for the database instance.
  • We recommend that you use high-performance SQL statements, such as INSERT, UPDATE, and DELETE statements, to import a large amount of data. Make sure that UPDATE and DELETE statements filter data by primary key so that primary key indexes are used.
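  For example, assume a hypothetical table named employee whose primary key is id. The following statements filter data by the primary key so that the primary key index is used:
    UPDATE employee SET age = 31 WHERE id = 1;
    DELETE FROM employee WHERE id = 2;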

Procedure

  1. Log on to the DMS console.
  2. In the top navigation bar, move the pointer over the More icon and choose Data Plans > Data Import.
  3. On the Large Data Import tab, set the parameters that are described in the following table.
    Parameter Description
    Database Required. The destination database to which data is imported.
    Note If the destination database is managed in an instance in Flexible Management or Stable Change mode, make sure that you have logged on to the database. If not, the database is not displayed for you to select.
    Reason Category Required. The reason for the data import.
    Business Background Required. The purpose or objective of the data import.
    File Encoding Required. The character encoding to be used by the destination database. Valid values:
    • UTF-8
    • GBK
    • ISO-8859-1
    • Automatic Identification
    Import mode The import mode. Valid values:
    • Speed Mode: In the Execute step, the uploaded file is read and SQL statements are executed to import data to the specified destination database. Compared with the security mode, this mode is faster but less secure.
    • Security mode: In the Precheck step, the uploaded file is parsed, and the SQL statements or CSV data in the file is cached. In the Execute step, the cached SQL statements are read and executed, or the cached CSV data is read and written to the specified destination database. Compared with the speed mode, this mode is more secure but slower.
    Note For a database instance that is managed in Security Collaboration mode, the speed mode is disabled in the security rules for data import by default. To enable the speed mode, click the SQL Correct tab on the Security Rules tab and set the Checkpoints parameter to Basic Configuration Item.
    File type Required. The format of the file for data import. You can set this parameter to SQL Script or CSV as required. Then, upload the file.
    Note
    • For more information about sample SQL scripts and CSV files, see Sample files.
    • If you set this parameter to CSV, set the Target Table Name, Data Location, and Write mode parameters as required.
    Target Table Name The destination table to which the data in the CSV format is imported.
    Data Location The type of the CSV file. Valid values:
    • 1st behavioral attributes: The first row of the CSV file contains field names.
    • 1st behavioral data: The first row of the CSV file contains data.
    Note For more information about sample CSV files, see Sample files.
    Write mode The mode that is used to write the data in the CSV format to the destination table. Valid values:
    • INSERT
    • INSERT_IGNORE
    • REPLACE_INTO
    Other Options Specifies whether to skip errors.
    Note
    • By default, the check box is cleared. If an error occurs, DMS stops executing SQL statements and sends an error message.
    • If you select the check box, DMS skips errors and continues to execute SQL statements.
    SQL Statements for Rollback Optional. The SQL statements or file that is used to roll back the data import. For an example, see the sketch after this procedure. Valid values:
    • Text: the SQL statements for rolling back the data import operation. Enter the SQL statements in the field.
    • Attachment: the file that contains the SQL statements for rolling back the data import operation. You can upload a TXT file or an SQL script. The file can be up to 15 MB in size.
    Attachments Optional. The images or files that contain other necessary information for the ticket.
    Change Stakeholder Optional. The stakeholders of the data import. All specified stakeholders can view the ticket details and assist in the approval process. Users other than the specified stakeholders, DMS administrators, and DBAs cannot view the ticket details.
  4. Click Submit and wait until the precheck is complete. If the ticket fails the precheck, troubleshoot the issue and submit the ticket again.
    Note If the uploaded file is an SQL script, DMS prechecks the SQL statements in the uploaded file. If the uploaded file is a CSV file, DMS generates INSERT statements based on the uploaded file.
  5. Submit the ticket for approval.
  6. After the ticket is approved, click Execute Change in the Execute step.
  7. In the Task Settings dialog box, set the Execute Immediately parameter. If you do not want the task to run immediately, set this parameter to Schedule and specify the point in time at which you want DMS to run the task. Then, click Confirm Execution.
    Note During the execution, DMS reads the SQL statements in streaming mode and executes the SQL statements in batches. Each batch of SQL statements is 1 MB in size.
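The rollback SQL statements that you provide for a ticket depend on your change. The following statement is a minimal sketch that assumes the hypothetical employee table used in the preceding examples and rolls back the import by deleting the inserted rows:
  -- Remove the rows that were inserted by the data import
  DELETE FROM employee WHERE id IN (1, 2);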

Sample files