
Data Management: Import data

Last Updated: Sep 21, 2023

The data import feature of Data Management (DMS) allows you to import large amounts of data into databases by uploading attachments. This feature reduces labor costs and resource usage.

Prerequisites

One of the following types of databases is used:

  • Relational databases

    • MySQL: ApsaraDB RDS for MySQL, PolarDB for MySQL, MyBase for MySQL, PolarDB for Xscale, AnalyticDB for MySQL, and MySQL databases from other sources

    • SQL Server: ApsaraDB RDS for SQL Server, MyBase for SQL Server, and SQL Server databases from other sources

    • PostgreSQL: ApsaraDB RDS for PostgreSQL, PolarDB for PostgreSQL, MyBase for PostgreSQL, AnalyticDB for PostgreSQL, and PostgreSQL databases from other sources

    • MariaDB: ApsaraDB for MariaDB and MariaDB databases from other sources

    • OceanBase

    • PolarDB for PostgreSQL (Compatible with Oracle)

    • DM

    • Db2

    • Oracle

  • Non-relational databases

    • ApsaraDB for Redis

    • ApsaraDB for MongoDB

Usage notes

  • The data import feature does not allow you to change data for multiple databases at a time. To change data for multiple databases at a time, submit a Normal Data Modify ticket. For more information, see Change regular data.

  • If you want to change only a small amount of data, we recommend that you submit a Normal Data Modify or Lockless change ticket to ensure that the data change is stable.

  • When you import a large amount of data, we recommend that you use SQL statements that deliver high performance, such as INSERT, UPDATE, and DELETE statements, and make sure that the UPDATE and DELETE statements use primary key indexes.

  • If the SQL script that you use to import a large amount of data contains SQL statements used to change a schema, the table may be locked due to schema change, even if the lock-free schema change feature is enabled.
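The primary-key recommendation above can be roughly automated before you upload a script. The following sketch flags UPDATE or DELETE statements whose WHERE clause does not mention the primary key column. The `risky_statements` helper and the `pk_column` parameter are illustrative assumptions, and the regex is a heuristic, not a full SQL parser:

```python
import re

def risky_statements(sql_script, pk_column="id"):
    """Flag UPDATE/DELETE statements whose WHERE clause does not
    filter on the given primary key column (rough heuristic)."""
    flagged = []
    for stmt in (s.strip() for s in sql_script.split(";")):
        if not stmt:
            continue
        if re.match(r"(?i)^(update|delete)\b", stmt):
            where = re.search(r"(?i)\bwhere\b(.*)", stmt, re.S)
            if where is None or pk_column not in where.group(1):
                flagged.append(stmt)
    return flagged

script = """
UPDATE users SET name = 'x' WHERE id = 1;
DELETE FROM users WHERE name = 'y';
INSERT INTO users (id, name) VALUES (3, 'z');
"""
print(risky_statements(script))  # → ["DELETE FROM users WHERE name = 'y'"]
```

Only the DELETE statement is flagged, because it filters on a non-key column; the UPDATE statement filters on the primary key and is left alone.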

Procedure

Note

In the following example, an ApsaraDB RDS for MySQL instance that is managed in Security Collaboration mode is used to describe the configuration procedure.

  1. Log on to the DMS console V5.0.
  2. In the top navigation bar, choose Database Development > Data Change > Data Import.

    Note

    If you use the DMS console in simple mode, move the pointer over the icon in the upper-left corner of the DMS console and choose All functions > Database Development > Data Change > Data Import.

  3. On the Data Change Tickets page, configure the parameters that are described in the following table.

    Parameter

    Description

    Database

    The destination database to which data is to be imported. You can select only one database at a time.

    Note

    If the destination database is managed in Flexible Management or Stable Change mode, make sure that you have logged on to the database. Otherwise, the database is not displayed for selection.

    Execution Method

    The way in which you want the ticket to be run. Valid values:

    • After Audit Approved, Order Submitter Execute

    • After Audit Approved, Auto Execute

    • Last Auditor Execute

    File Encoding

    The file encoding type that you want to use for the destination database. Valid values:

    • UTF-8

    • GBK

    • ISO-8859-1

    • Automatic Identification

    Import mode

    The import mode. Valid values:

    • Speed Mode: In the Execute step, the SQL statements in the uploaded file are read and directly executed to import data to the specified destination database. The speed mode is less secure but faster than the security mode.

      Note

      By default, the speed mode is disabled for a database instance based on security rules. You can enable the speed mode by performing the following operations: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Basic Configuration Item checkpoint, find the "Whether data import supports selecting speed mode" rule and click Edit in the Actions column. In the dialog box that appears, turn on Configuration Value.

    • Security mode: In the Precheck step, the uploaded file is parsed, and the SQL statements or CSV file data in the uploaded file is cached. In the Execute step, the cached SQL statements are read and executed to import data, or the cached CSV file data is read and imported to the specified destination database. The security mode is more secure but slower than the speed mode.

    File type

    The format of the file for the data import. Valid values:

    • SQL Script: By default, you can use only the INSERT and REPLACE statements to import data to database instances that are managed in Security Collaboration mode. If you want to use other SQL statements to import data, modify the security rules for data import as a database administrator (DBA) or DMS administrator. You can modify the security rules by performing the following operations: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Batch Data import rules checkpoint, modify the security rules based on your business requirements.

    • CSV: The delimiters in the file must be commas (,).

    • Excel: The file can contain table headers and data, or contain only data. Table headers contain the attributes of data.

    Target Table Name

    The destination table to which data is to be imported.

    Data Location

    The location of the data in the file. Valid values:

    • First Row Contains Attributes: The first row of the table contains field names.

    • First Row Contains Data: The first row of the table contains data.

    Write mode

    The mode that you want to use to write the imported data to the destination table. Valid values:

    • INSERT: The database checks the primary key when data is written. If a duplicate primary key value exists, an error message is returned.

    • INSERT_IGNORE: If the imported data contains data records that are the same as those in the destination table, the data records in the imported data are ignored.

    • REPLACE_INTO: If the imported data contains a row that has the same value for the primary key or unique index as a row in the destination table, the original row that contains the primary key or unique index is deleted, and the new row is inserted.

    Note

    You can use the INSERT INTO, INSERT IGNORE, or REPLACE INTO statement to write data to ApsaraDB RDS for MySQL, PolarDB for MySQL, PolarDB for Xscale, AnalyticDB for MySQL, and ApsaraDB for OceanBase databases. You can use only the INSERT INTO statement to write data to other databases.

    Attachment

    The file for the data import. Click Upload a file and select the file that you want to upload.

    Note
    • Supported file types: SQL, CSV, TXT, XLSX, and ZIP.

    • The maximum size of the file to be uploaded is 5 GB.

    Other Options

    Optional. This parameter specifies whether to skip errors.

    • By default, the check box is cleared. If an error occurs, DMS stops executing SQL statements and returns an error message.

    • If you select the check box, DMS skips errors and continues to execute SQL statements.

    SQL Statements for Rollback

    Optional. Valid values:

    • Text: the SQL statements for rolling back the data import operation. Enter the SQL statements in the SQL Text field.

    • Attachment: the SQL file for rolling back the data import operation. Upload the SQL file.

      Note
      • Supported file types: SQL, TXT, and ZIP.

      • The maximum size of the file is 15 MB.

    Change Stakeholder

    Optional. The stakeholders involved in the data import. All specified stakeholders can view the ticket details and take part in the approval process. Users other than the specified stakeholders, DMS administrators, and DBAs cannot view the ticket details.

    Attachments

    Optional. The images or files that contain other necessary information about the ticket.
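
The behavior of the three write modes in the table above can be illustrated with a small, hypothetical sketch. SQLite's `INSERT OR IGNORE` and `INSERT OR REPLACE` clauses stand in for MySQL's `INSERT IGNORE` and `REPLACE INTO`, which resolve primary key conflicts in the same way:

```python
import sqlite3

# In-memory table with a primary key to provoke conflicts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# INSERT: a duplicate primary key value returns an error.
try:
    conn.execute("INSERT INTO users VALUES (1, 'bob')")
except sqlite3.IntegrityError:
    pass  # duplicate key rejected, existing row untouched

# INSERT_IGNORE: the conflicting row is silently skipped.
conn.execute("INSERT OR IGNORE INTO users VALUES (1, 'bob')")
print(conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0])  # alice

# REPLACE_INTO: the existing row is deleted, the new row is inserted.
conn.execute("INSERT OR REPLACE INTO users VALUES (1, 'carol')")
print(conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0])  # carol
```

Note that REPLACE semantics delete and re-insert the row, so any columns not supplied in the new row are reset rather than preserved.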

  4. Click Submit and wait until the precheck is complete. If the ticket fails the precheck, troubleshoot the issue and click Retry.

    Note
    • If the uploaded file is an SQL script, DMS prechecks the SQL statements in the uploaded file. If the uploaded file is a CSV file, DMS generates INSERT statements based on the uploaded file.

    • If an error is reported during the type check of the Precheck step, adjust the security rule set that is applied to the database instance based on the error message. For more information, see Data change.

  5. In the Approval step, click Submit for Approval. In the Prompt message, click OK.

  6. Optional. In the Execute step, click Execute Change.

  7. In the Task Settings dialog box, specify the time to run the task.

    You can use one of the following methods to run the task:

    • Running immediately: By default, this option is selected. Click Confirm Execution to run the task immediately.

    • Schedule: If you select this option, you must specify the start time for the task. Click Confirm Execution to run the task at the specified time.

    Note
    • During the execution, DMS reads the SQL statements in streaming mode and executes the SQL statements in batches. Each batch of SQL statements is 1 MB in size.

    • In the Execute step, you can view the status, SQL check details, and scheduling logs of the task.

    • If you want to restart a task that is suspended, the task is run from the beginning or the offset of suspension based on the import mode.

      • Speed mode: If the task is suspended and restarted, the SQL script is executed, or the data files are imported from the beginning.

      • Security mode: If the task is suspended and restarted, the SQL script is executed, or the data files are imported from the offset of suspension.
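
The streaming, batched execution described in the notes above can be sketched as follows. This is a minimal illustration under assumed behavior, not DMS internals: it groups whole statements into batches of roughly a fixed byte size and records each batch's starting offset, which is the kind of bookkeeping that lets a security-mode task resume from the offset of suspension:

```python
import tempfile

def iter_statement_batches(path, batch_bytes=1024 * 1024, start_offset=0):
    """Yield (offset, batch) pairs from an SQL script, grouping whole
    statements into batches of roughly batch_bytes bytes. Passing a
    saved start_offset mimics resuming a suspended security-mode task."""
    with open(path, encoding="utf-8") as f:
        f.seek(start_offset)
        batch, size, offset = [], 0, start_offset
        for line in f:
            batch.append(line)
            size += len(line.encode("utf-8"))
            # Only cut a batch at a statement boundary.
            if size >= batch_bytes and line.rstrip().endswith(";"):
                yield offset, "".join(batch)
                offset += size
                batch, size = [], 0
        if batch:
            yield offset, "".join(batch)

# Demo: three 26-byte statements, cut into batches of roughly 30 bytes.
with tempfile.NamedTemporaryFile("w", suffix=".sql", delete=False) as f:
    f.write("INSERT INTO t VALUES (1);\n" * 3)
batches = list(iter_statement_batches(f.name, batch_bytes=30))
print(len(batches))  # 2: a two-statement batch, then the remainder
```

Because batches end only at statement boundaries, a restart from a recorded offset never re-executes half of a statement.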

  8. Optional. After the task is complete, you can open the SQL Console tab of the destination database and query the imported data.

Sample files