
Data Management: Import data

Last Updated: Feb 29, 2024

If you want to import a large amount of data to a database, you can use the data import feature provided by Data Management (DMS) to import an SQL script, a CSV file, or an Excel file.

Prerequisites

The database is of one of the following types:

  • Relational databases

    • MySQL: ApsaraDB RDS for MySQL, PolarDB for MySQL, ApsaraDB MyBase for MySQL, PolarDB-X, AnalyticDB for MySQL, and MySQL databases from other sources

    • SQL Server: ApsaraDB RDS for SQL Server, ApsaraDB MyBase for SQL Server, and SQL Server databases from other sources

    • PostgreSQL: ApsaraDB RDS for PostgreSQL, PolarDB for PostgreSQL, ApsaraDB MyBase for PostgreSQL, AnalyticDB for PostgreSQL, and PostgreSQL databases from other sources

    • MariaDB: ApsaraDB RDS for MariaDB TX and MariaDB databases from other sources

    • OceanBase

    • PolarDB for PostgreSQL(Compatible with Oracle)

    • Dameng (DM)

    • Db2

    • Oracle

  • Non-relational databases

    • Redis

    • MongoDB

Usage notes

  • The data import feature does not allow you to change data in multiple databases at a time. To change data in multiple databases at a time, submit a Normal Data Modify ticket. For more information, see Perform regular data change.

  • If you want to change only a small amount of data, we recommend that you submit a Normal Data Modify or Lockless change ticket to ensure a stable data change.

  • To import a large amount of data, we recommend that you use SQL statements with high performance, such as INSERT, UPDATE, and DELETE statements. UPDATE and DELETE statements must filter on primary key indexes.

  • If the SQL script that you use to import a large amount of data contains statements that change a schema, the table may be locked during the schema change, even if the lock-free schema change feature is enabled.
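As an illustration of why batched statements and primary-key filters matter for large imports, the following sketch is not DMS-specific: it uses Python's standard sqlite3 module as a stand-in for the destination database, and the table and column names are made up for the example.

```python
# Illustrative sketch (not DMS internals): batched inserts plus a
# primary-key-filtered DELETE, using sqlite3 as a stand-in database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

rows = [(i, f"user_{i}") for i in range(1000)]

# executemany sends the rows through one prepared statement, which is
# far faster than executing 1000 separate INSERT statements.
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)", rows)
conn.commit()

# A DELETE that filters on the primary key can use the PK index;
# filtering on an unindexed column would force a full table scan.
conn.execute("DELETE FROM users WHERE id < 100")
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 900
```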

Procedure

Note

In the following example, an ApsaraDB RDS for MySQL instance that is managed in Security Collaboration mode is used to describe the configuration procedure.

  1. Log on to the DMS console V5.0.
  2. In the top navigation bar, choose Database Development > Data Change > Data Import.

    Note

    If you use the DMS console in simple mode, move the pointer over the icon in the upper-left corner of the DMS console and choose All functions > Database Development > Data Change > Data Import.

  3. On the Data Change Tickets page, configure the ticket parameters. The following table describes the key parameters.

    Parameter

    Description

    Database

    The destination database to which you want to import data. You can select only one database at a time.

    Note

    If the destination database is managed in Flexible Management or Stable Change mode, make sure that you have logged on to the database. Otherwise, the database is not displayed for selection.

    Execution Method

    The way in which you want the ticket to be run. Valid values:

    • After Audit Approved, Order Submitter Execute

    • After Audit Approved, Auto Execute

    • Last Auditor Execute

    File Encoding

    The encoding that you want to use for the destination database. Valid values:

    • UTF-8

    • GBK

    • ISO-8859-1

    • Automatic Identification

    Import Mode

    The import mode. Valid values:

    • Speed Mode: In the Execute step, the SQL statements in the uploaded file are read and directly executed to import data to the destination database. The speed mode is less secure but faster than the security mode.

      Note

      By default, Speed Mode is disabled for a database instance based on security rules. You can enable the speed mode by performing the following steps: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Basic Configuration Item checkpoint, find the "Whether data import supports selecting speed mode" rule and click Edit in the Actions column. In the dialog box that appears, turn on Configuration Value.

    • Security mode: In the Precheck step, the uploaded file is parsed, and the SQL statements or CSV file data in the uploaded file is cached. In the Execute step, the cached SQL statements are read and executed to import data, or the cached CSV file data is read and imported to the destination database. The security mode is more secure but slower than the speed mode.

    File Type

    The format of the file for the data import. Valid values:

    • SQL Script: By default, you can use only the INSERT and REPLACE statements to import data to database instances that are managed in Security Collaboration mode. If you want to use other SQL statements to import data, modify the security rules for data import as a database administrator (DBA) or DMS administrator. You can modify the security rules by performing the following operations: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Batch Data import rules checkpoint, modify the security rules based on your business requirements.

    • CSV: The delimiters in the file must be commas (,).

    • Excel: The file can contain table headers and data, or contain only data. Table headers contain the attributes of data.

    Target Table Name

    The destination table to which data is to be imported.

    Note

    This parameter is displayed after you select CSV or Excel as File Type.

    Data Location

    Specifies whether the first row of the file contains field names or data. Valid values:

    • 1st behavioral attributes: The first row of the table contains field names.

    • 1st behavioral data: The first row of the table contains data.

    Write mode

    The mode that you want to use to write the imported data to the destination table. Valid values:

    • INSERT: The database checks the primary key when data is written. If a duplicate primary key value exists, an error message is returned.

    • INSERT_IGNORE: If a row in the imported data has the same primary key or unique index value as a row in the destination table, the imported row is skipped.

    • REPLACE_INTO: If the imported data contains a row that has the same value for the primary key or unique index as a row in the destination table, the original row that contains the primary key or unique index is deleted and the new row is inserted.

    Note

    You can use the INSERT INTO, INSERT IGNORE, or REPLACE INTO statement to write data to ApsaraDB RDS for MySQL, PolarDB for MySQL, PolarDB-X, AnalyticDB for MySQL, and ApsaraDB for OceanBase databases. You can use only the INSERT INTO statement to write data to other databases.

    Attachment

    The file for the data import. Click File to upload a file.

    Note
    • Supported file formats: SQL, CSV, TXT, XLSX, and ZIP.

    • The file can be up to 5 GB in size.

    Other Options

    Optional. Specifies whether to skip errors.

    • By default, the check box is cleared. If an error occurs, DMS stops executing SQL statements and returns an error message.

    • If you select the check box, DMS skips errors and continues to execute SQL statements.

    SQL Statements for Rollback

    • Text: the SQL statements for rolling back the data import operation. Enter the SQL statements in the SQL Text field.

    • Attachment: the SQL file for rollback. Upload the SQL file.

      Note
      • Supported file formats: SQL, TXT, and ZIP.

      • The file can be up to 15 MB in size.

    Change Stakeholder

    Optional. The stakeholders involved in the data import. All specified stakeholders can view the ticket details and take part in the approval process. Users other than the stakeholders, DMS administrators, and DBAs cannot view the ticket details.
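The behavior of the three write modes can be sketched with Python's standard sqlite3 module. Note the hedge: SQLite's INSERT OR IGNORE and INSERT OR REPLACE are used here only as stand-ins for MySQL's INSERT IGNORE and REPLACE INTO, whose conflict-handling semantics are analogous.

```python
# Illustrative comparison of the INSERT, INSERT_IGNORE, and
# REPLACE_INTO write modes, using SQLite equivalents as stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'original')")

# INSERT: a duplicate primary key value returns an error.
try:
    conn.execute("INSERT INTO t VALUES (1, 'dup')")
except sqlite3.IntegrityError as e:
    print("INSERT failed:", e)

# INSERT_IGNORE: the conflicting imported row is silently skipped.
conn.execute("INSERT OR IGNORE INTO t VALUES (1, 'ignored')")
print(conn.execute("SELECT val FROM t WHERE id = 1").fetchone()[0])  # original

# REPLACE_INTO: the existing row is deleted and the new row inserted.
conn.execute("INSERT OR REPLACE INTO t VALUES (1, 'replaced')")
print(conn.execute("SELECT val FROM t WHERE id = 1").fetchone()[0])  # replaced
```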

  4. Click Submit and wait until the precheck is complete. If the ticket fails the precheck, troubleshoot the issue and click Retry.

    Note
    • If the uploaded file is an SQL script, DMS prechecks the SQL statements in the uploaded file. If the uploaded file is a CSV file, DMS generates INSERT statements based on the uploaded file.

    • If an error is reported during the type check of the Precheck step, adjust the security rule set that is applied to the database instance based on the error message. For more information, see Data change.
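For a CSV file, the generated INSERT statements resemble the output of the following hypothetical sketch; the function name and the simple quoting scheme are illustrative only, not DMS internals.

```python
# Hypothetical sketch: generating one INSERT statement per data row of
# a CSV file whose first row contains field names.
import csv
import io

def csv_to_inserts(csv_text, table):
    """Yield one INSERT statement per CSV data row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)                      # first row: field names
    cols = ", ".join(header)
    for row in reader:
        # Naive quoting for illustration: wrap values in single quotes
        # and double any embedded single quotes.
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        yield f"INSERT INTO {table} ({cols}) VALUES ({vals});"

sample = "id,name\n1,alice\n2,bob\n"
for stmt in csv_to_inserts(sample, "users"):
    print(stmt)
# INSERT INTO users (id, name) VALUES ('1', 'alice');
# INSERT INTO users (id, name) VALUES ('2', 'bob');
```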

  5. In the Approval step, click Submit for Approval. In the Prompt message, click OK.

  6. After the ticket is approved, click Execute Change in the Execute step.

  7. In the Task Settings dialog box, specify the time to run the task.

    You can use one of the following methods to run the task:

    • Running immediately: By default, this option is selected. If you select this option, the task is immediately run after you click Confirm Execution.

    • Schedule: If you select this option, you must specify the start time for the task. After you click Confirm Execution, the task is run at the specified point in time.

    Note
    • During the execution, DMS reads the SQL statements in streaming mode and executes the SQL statements in batches. Each batch of SQL statements is 1 MB in size.

    • In the Execute step, you can view the execution status, SQL check details, and scheduling logs of the task.

    • If you want to restart a task that is suspended, the task is run from the beginning or the offset of suspension based on the import mode.

      • Speed mode: If the task is suspended and restarted, the SQL script is executed or the data file is imported again from the beginning.

      • Security mode: If the task is suspended and restarted, the SQL script is executed or the data file is imported from the offset at which the task was suspended.
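The streaming read-and-batch behavior described in the note above can be sketched as follows. The function and the small byte limit used in the example are illustrative assumptions; DMS itself batches statements up to 1 MB.

```python
# Illustrative sketch: group streamed SQL statements into batches of
# at most batch_bytes (DMS uses 1 MB batches; a tiny limit is used
# below so the example stays readable).
def batch_statements(statements, batch_bytes=1024 * 1024):
    batch, size = [], 0
    for stmt in statements:
        stmt_size = len(stmt.encode("utf-8"))
        if batch and size + stmt_size > batch_bytes:
            yield batch            # flush the current batch
            batch, size = [], 0
        batch.append(stmt)
        size += stmt_size
    if batch:
        yield batch                # flush the final partial batch

stmts = [f"INSERT INTO t VALUES ({i});" for i in range(10)]
batches = list(batch_statements(stmts, batch_bytes=60))
print(len(batches))  # 5 (each 25-byte statement fits two per 60-byte batch)
```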

  8. Wait until a message appears, indicating that the data is imported.

    After the import is complete, you can go to the SQL Console tab of the destination database to query the imported data. For more information, see Manage a database on the SQL Console tab.

Sample files