Data Management (DMS) allows you to quickly import large amounts of data into a database by uploading an attachment. This saves labor and resources.

Supported database types

  • MySQL: ApsaraDB RDS for MySQL, PolarDB for MySQL, MyBase for MySQL, PolarDB-X, AnalyticDB for MySQL, and MySQL databases that are not on Alibaba Cloud
  • SQL Server: ApsaraDB RDS for SQL Server, MyBase for SQL Server, and SQL Server databases that are not on Alibaba Cloud
  • PostgreSQL: ApsaraDB RDS for PostgreSQL, PolarDB for PostgreSQL, MyBase for PostgreSQL, AnalyticDB for PostgreSQL, and PostgreSQL databases that are not on Alibaba Cloud
  • MariaDB: ApsaraDB for MariaDB TX and MariaDB databases that are not on Alibaba Cloud
  • OceanBase
  • PolarDB for Oracle
  • DM

Usage notes

  • If you need to use SQL statements to import only a small amount of data, we recommend that you submit a Normal Data Modify or Lockless change ticket instead.
  • To import a large amount of data, we recommend that you use SQL statements that deliver good performance, such as INSERT statements, and make sure that UPDATE and DELETE statements filter on primary keys so that primary key indexes are used. A sketch of such a script follows this list.
  • If the SQL script that you use to import a large amount of data contains statements that change a schema, the schema change may lock the table even if the lock-free DDL feature is enabled.
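
The following is a minimal sketch of an import script that follows these recommendations. The orders table and its columns are hypothetical.

    -- Batched INSERT statements perform well for bulk imports.
    INSERT INTO orders (id, customer_id, amount) VALUES
      (1, 101, 25.00),
      (2, 102, 13.50),
      (3, 103, 99.90);

    -- UPDATE and DELETE statements should filter on the primary key
    -- so that the primary key index is used.
    UPDATE orders SET amount = 26.00 WHERE id = 1;
    DELETE FROM orders WHERE id = 3;

    -- Avoid mixing schema changes such as the following into a large
    -- import script: the change may lock the table even if the
    -- lock-free DDL feature is enabled.
    -- ALTER TABLE orders ADD COLUMN remark VARCHAR(64);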

Procedure

  1. Go to the DMS console V5.0.
  2. In the top navigation bar, click Database Development. In the left-side navigation pane, choose Data Change > Data Import.
  3. On the Data Change Ticket Application page, configure the parameters described in the following list.
    Database: The destination database to which you want to import data.
    Note: If the destination database belongs to a database instance in Flexible Management or Stable Change mode, make sure that you have logged on to the database. Otherwise, the database is not displayed in the list.
    Associated Iterations: Optional. The iteration that you want to associate with the data import. You can click Select/Change Iteration to select an iteration.
    Reason Category: The reason for the data import.
    Business Background: The purpose or objective of the data import. A detailed description helps reduce unnecessary communication.
    Execution Method: The way in which you want the ticket to be run. Valid values:
    • After Audit Approved, Order Submitter Execute
    • After Audit Approved, Auto Execute
    • Last Auditor Execute
    File Encoding: The encoding algorithm to be used by the destination database. Valid values:
    • UTF-8
    • GBK
    • ISO-8859-1
    • Automatic Identification
    Import mode: The import mode. Valid values:
    • Speed Mode: In the Execute step, the SQL statements in the uploaded file are read and directly executed to import data into the specified destination database. Compared with Security Mode, this mode is faster but less secure.
      Note: By default, Speed Mode is disabled for database instances that are managed in Security Collaboration mode. To enable it, go to the Details page of the security rule set that applies to the database instance, click the SQL Correct tab, and modify the Whether data import supports selecting speed mode configuration item under the Basic Configuration Item checkpoint.
    • Security Mode: In the Precheck step, the uploaded file is parsed, and the SQL statements or CSV data in it are cached. In the Execute step, the cached SQL statements are read and executed, or the cached CSV data is read and imported into the specified destination database. This mode is more secure but slower.
    File type: The format of the file for data import. Valid values:
    • SQL Script: By default, you can use only INSERT and REPLACE statements to import data into database instances that are managed in Security Collaboration mode. If you want to use other SQL statements, a database administrator (DBA) or DMS administrator must modify the security rules for data import: go to the Details page of the security rule set that applies to the database instance, click the SQL Correct tab, and modify the rules under the Batch Data import rules checkpoint as needed.
    • CSV: The delimiters in the file must be commas (,). A sample CSV file is shown after this procedure.
    • Excel: The file can contain a header row of field names followed by data rows, or contain only data rows.
    Target Table Name: The destination table to which data is to be imported.
    Data Location: The location of the data in the file. Valid values:
    • 1st behavioral attributes: The first row of the table contains field names.
    • 1st behavioral data: The first row of the table contains data.
    Write mode: The mode in which the imported data is written to the destination table. Valid values:
    • INSERT: The database checks the primary key when data is written. If a duplicate primary key value exists, an error message is returned.
    • INSERT_IGNORE: If the imported data contains records that duplicate those in the destination table, the duplicate records are ignored.
    • REPLACE_INTO: If the imported data contains a row that has the same primary key or unique index value as an existing row, the database deletes the existing row and inserts the new row.
    Note: You can use the INSERT INTO, INSERT IGNORE, or REPLACE INTO statement to write data to ApsaraDB RDS for MySQL, PolarDB for MySQL, PolarDB-X, AnalyticDB for MySQL, and ApsaraDB for OceanBase databases. You can use only the INSERT INTO statement to write data to other databases. A sketch that contrasts these statements follows this procedure.
    Attachment: The file for data import. Click Upload a file to upload the file.
    Note
    • Supported file types include SQL, CSV, TXT, XLSX, and ZIP.
    • The uploaded file can be up to 5 GB in size.
    Other Options: Optional. Specifies whether to skip errors.
    • By default, the check box is cleared. If an error occurs, DMS stops executing the SQL statements and returns an error message.
    • If you select the check box, DMS skips errors and continues to execute the SQL statements.
    SQL Statements for Rollback: Optional. The SQL statements or file used to roll back the data import. A sample rollback script is shown after this procedure. Valid values:
    • Text: the SQL statements for rolling back the data import. Enter the SQL statements in the SQL Text field.
    • Attachment: the SQL file for rolling back the data import.
      Note
      • Supported file types include SQL, TXT, and ZIP.
      • The file can be up to 15 MB in size.
    Change Stakeholder: Optional. The stakeholders involved in the data import. All specified stakeholders can view the ticket details and assist in the approval process. Users other than the specified stakeholders, DMS administrators, and DBAs cannot view the ticket details.
    Attachments: Optional. The images or files that provide other information necessary for the ticket.
  4. Click Submit and wait until the precheck is complete. If the ticket fails the precheck, troubleshoot the issue and submit the ticket again.
    Note If the uploaded file is an SQL script, DMS prechecks the SQL statements in the uploaded file. If the uploaded file is a CSV file, DMS generates INSERT statements based on the uploaded file.
  5. In the Approval step, click Submit for Approval. In the Prompt message, click OK.
  6. Optional: In the Execute step, click Execute Change.
  7. In the Task Settings dialog box, set the Execution Strategy parameter and then click Confirm Execution. Valid values:
    • Running immediately: the default option. The task runs immediately after you click Confirm Execution.
    • Schedule: If you select this option, you must specify a start time for the task. The task runs at the specified time.
    Note
    • During the execution, DMS reads the SQL statements in streaming mode and executes the SQL statements in batches. Each batch of SQL statements is 1 MB in size.
    • In the Execute step, you can view the status, SQL check details, and scheduling logs of the task.
    • After the task is complete, you can open the SQL Console tab of the destination database and query the imported data, as shown in the example query after this procedure.
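
Sample CSV file: the following minimal example assumes a hypothetical orders table with the columns id, name, and amount. If you set the Data Location parameter to 1st behavioral attributes, the first row is treated as field names; if you set it to 1st behavioral data, every row, including the first, is treated as data. During the precheck, DMS generates INSERT statements from the data rows.

    id,name,amount
    1,Alice,25.00
    2,Bob,13.50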
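
Write mode sketch: the following statements illustrate the differences among the three write modes, assuming a hypothetical orders table whose primary key is id and that already contains a row with id = 1.

    -- INSERT: the duplicate primary key value causes an error to be returned.
    INSERT INTO orders (id, amount) VALUES (1, 25.00);

    -- INSERT IGNORE: the duplicate row is skipped and the existing row is kept.
    INSERT IGNORE INTO orders (id, amount) VALUES (1, 25.00);

    -- REPLACE INTO: the existing row is deleted and the new row is inserted.
    REPLACE INTO orders (id, amount) VALUES (1, 25.00);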
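
Sample rollback script: rollback SQL statements typically undo the import. For example, if the import inserted rows with known primary key values into the hypothetical orders table, the following script removes them:

    -- Remove the rows that the import inserted.
    DELETE FROM orders WHERE id IN (1, 2);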
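
Example query: after the task is complete, you can verify the result on the SQL Console tab of the destination database, for example against the hypothetical orders table:

    -- Check how many rows the table now contains.
    SELECT COUNT(*) FROM orders;
    -- Inspect a few imported rows.
    SELECT * FROM orders LIMIT 10;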

Sample files