Data Management: Import data

Last Updated: Oct 16, 2024

You can use the data import feature provided by Data Management (DMS) to import a large amount of data to a database from an SQL script, a CSV file, or an Excel file.

Prerequisites

  • The database is of one of the following types:

    • MySQL: ApsaraDB RDS for MySQL, PolarDB for MySQL, ApsaraDB MyBase for MySQL, PolarDB for Xscale (PolarDB-X), AnalyticDB for MySQL, and MySQL databases from other sources

    • SQL Server: ApsaraDB RDS for SQL Server, ApsaraDB MyBase for SQL Server, and SQL Server databases from other sources

    • PostgreSQL: ApsaraDB RDS for PostgreSQL, PolarDB for PostgreSQL, ApsaraDB MyBase for PostgreSQL, AnalyticDB for PostgreSQL, and PostgreSQL databases from other sources

    • MariaDB: ApsaraDB RDS for MariaDB and MariaDB databases from other sources

    • OceanBase

    • PolarDB for PostgreSQL (Compatible with Oracle)

    • Dameng (DM)

    • Db2

    • Oracle

    • Redis

    • MongoDB

  • In DMS, you have logged on to the database to which the data is to be imported. For more information, see Log on to a database instance.

  • You have change permissions on the destination database.

Usage notes

  • The SQL script, CSV file, or Excel file to be imported in a single ticket can be up to 5 GB in size.

  • DMS allows you to upload a ZIP file as an attachment. If you want to import multiple files by submitting a single ticket, you can package the files into a ZIP file for upload.

  • The data import feature does not allow you to change data for multiple databases at a time. To change data for multiple databases at a time, submit a Normal Data Modify ticket. For more information, see Normal data modify.

  • If you want to change only a small amount of data, we recommend that you submit a Normal Data Modify or Lockless Change ticket to ensure that the change is applied in a stable manner.

  • If the SQL script that you use to import a large amount of data contains SQL statements used to change a schema, the table may be locked due to schema change even if the lock-free schema change feature is enabled.

  • To prevent database performance from being compromised when a large amount of data is imported, we recommend that you use efficient SQL statements such as INSERT, UPDATE, and DELETE. UPDATE and DELETE statements must filter on primary key indexes.
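The recommendation above can be sketched with a short, hedged Python example. The table name `orders` and its columns are hypothetical; the script generates an import script in which every UPDATE filters on the primary key, so the database can use the primary key index instead of scanning the table:

```python
# Sketch: generate an SQL import script whose UPDATE statements filter
# on the primary key, as recommended above.
# The table "orders" and its columns are hypothetical examples.

rows = [(1, "pending"), (2, "shipped"), (3, "cancelled")]

statements = []
for order_id, status in rows:
    # Filtering on the primary key (id) lets the database use the
    # primary key index rather than scanning the whole table.
    statements.append(
        f"UPDATE orders SET status = '{status}' WHERE id = {order_id};"
    )

script = "\n".join(statements)
print(script)
```

The resulting script can be saved as a .sql file and uploaded as the attachment of a data import ticket.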

Procedure

Note

In this topic, an ApsaraDB RDS for MySQL instance that is managed in Security Collaboration mode is used to describe the configuration procedure.

  1. Log on to the DMS console V5.0.
  2. Move the pointer over the icon in the upper-left corner of the DMS console and choose All Features > Database Development > Data Change > Data Import.

    Note

    If you use the DMS console in normal mode, choose Database Development > Data Change > Data Import in the top navigation bar.

  3. On the Data Change Tickets page, configure the ticket parameters. The following table describes the key parameters.

    Parameter

    Description

    Database

    The database to which data is to be imported. You can select only one database from the Database drop-down list at a time.

    Note

    If the database is managed in Flexible Management or Stable Change mode, make sure that you have logged on to the database. Otherwise, the database is not displayed in the Database drop-down list.

    Execution Method

    The way in which you want the ticket to be run. By default, Ticket Submitter Executes Upon Approval is selected. You can also set this parameter to one of the following values:

    • Automatically Execute Upon Approval

    • Last Approver Executes

    File Encoding

    The encoding method of the database. By default, Automatic Identification is selected. You can also set this parameter to one of the following values:

    • UTF-8

    • GBK

    • ISO-8859-1

    Import Mode

    The mode in which data is to be imported. Valid values:

    • Speed Mode: In the Execute step, the SQL statements in the uploaded file are read and directly executed to import data to the destination database. The speed mode is less secure but faster than the security mode.

      Note

      By default, Speed Mode is disabled for a database instance based on security rules. You can enable the speed mode by performing the following operations: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Basic Configuration Item checkpoint, find the "Whether data import supports selecting speed mode" rule and click Edit in the Actions column. In the dialog box that appears, turn on Configuration Value.

    • Security Mode: In the Precheck step, the uploaded file is parsed, and the SQL statements or CSV file data in the uploaded file is cached. In the Execute step, the cached SQL statements are read and executed to import data, or the cached CSV file data is read and imported to the destination database. The security mode is more secure but slower than the speed mode.

    File Type

    The format of the file for the data import. Valid values:

    • SQL Script: By default, you can use only the INSERT and REPLACE statements to import data to database instances that are managed in Security Collaboration mode. If you want to use other SQL statements to import data, modify the security rules for data import as a database administrator (DBA) or DMS administrator. You can modify the security rules by performing the following operations: Go to the Details page of the security rule set that is applied to the database instance and click the SQL Correct tab. In the list below the Batch Data import rules checkpoint, modify the security rules based on your business requirements.

    • CSV: The delimiters in the file must be commas (,).

    • Excel: The file can contain table headers and data, or contain only data. Table headers contain the attributes of data.

    Destination Table Name

    The destination table to which data is to be imported.

    Note

    This parameter is displayed after you set the File Type parameter to CSV or Excel.

    Data Location

    Specifies where the data starts in the file. Valid values:

    • First Row Contains Field Names: The first row of the file contains field names (column headers).

    • First Row Contains Data: The first row of the file contains data.

    Write Mode

    The mode that you want to use to write the imported data to the destination table. Valid values:

    • INSERT: The database checks the primary key during data insertion. If duplicate primary key values are detected, an error is reported.

    • INSERT_IGNORE: If the imported data contains data records that are the same as those in the destination table, the new data records are ignored.

    • REPLACE_INTO: If the imported data contains a row that has the same value for the primary key or unique index as an existing row in the destination table, the system deletes the existing row and inserts the new row into the destination table.

    Note

    You can use the INSERT INTO, INSERT IGNORE, or REPLACE INTO statement to write data to ApsaraDB RDS for MySQL, PolarDB for MySQL, PolarDB-X, AnalyticDB for MySQL, and ApsaraDB for OceanBase databases. You can use only the INSERT INTO statement to write data to other databases.

    Attachment

    The file that you want to import. You can upload a local file (File) or an OSS object (Upload OSS Object).

    If you want to upload an object that is stored in Object Storage Service (OSS), first register the OSS data source with DMS. After the OSS data source is registered, click Upload OSS Object. In the Upload OSS Object dialog box, enter the file path. Example: examplefolder/example.sql.

    Note
    • Supported file formats include SQL, CSV, TXT, XLSX, and ZIP.

    • The file to be imported can be up to 5 GB in size.

    • For more information about how to register an OSS data source with DMS, see Register an Alibaba Cloud database instance.

    Other Options

    Optional. Specifies whether to skip errors.

    • By default, the check box is cleared. If an error occurs, DMS stops executing SQL statements and returns an error message.

    • If you select the check box, DMS skips errors and continues to execute SQL statements.

    SQL Statements for Rollback

    • Text: the SQL statements for rolling back the data import operation. Enter the SQL statements in the SQL Text field.

    • Attachment: the SQL file for rollback. Upload the SQL file.

      Note
      • Supported file formats include SQL, TXT, and ZIP.

      • The file for rollback can be up to 15 MB in size.

    Change Stakeholder

    Optional. The stakeholders involved in the data import. All specified stakeholders can view the ticket details and take part in the approval process. Users other than the specified stakeholders, DMS administrators, and DBAs cannot view the ticket details.

  4. Click Submit and wait until the precheck is complete. If the ticket fails the precheck, troubleshoot the issue and click Retry.

    Note
    • If the uploaded file is an SQL script, DMS prechecks the SQL statements in the uploaded file. If the uploaded file is a CSV file, DMS generates INSERT statements based on the uploaded file.

    • If an error is reported during the type check of the Precheck step, modify the security rules in the security rule set that is applied to the database instance based on the error message. For more information, see Data change.

  5. In the Approval step, click Submit for Approval. In the Prompt message, click OK.

  6. After the ticket is approved, click Execute Change in the Execute step.

  7. In the Task Settings dialog box, specify the time to run the task.

    You can use one of the following methods to run the task:

    • Running immediately: By default, this option is selected. If you select this option, the task is immediately run after you click Confirm Execution.

    • Schedule: If you select this option, you must specify the start time for the task. After you click Confirm Execution, the task is run at the specified point in time.

    Note
    • When the task is run, DMS reads the SQL statements in streaming mode and executes the SQL statements in batches. Each batch of SQL statements is 1 MB in size.

    • In the Execute step, you can view the status, SQL check details, and scheduling logs of the task.

    • If you want to restart a task that is suspended, the task is run from the beginning or the offset of suspension based on the import mode.

      • Speed mode: If the task is suspended and restarted, the SQL script is executed or the data files are imported from the beginning.

      • Security mode: If the task is suspended and restarted, the SQL script is executed or the data files are imported from the offset of suspension.

  8. Wait until a message appears that indicates the data has been imported.

  9. Optional. Query the imported data.

    Go to the SQLConsole tab to query the imported data. For more information, see Manage a database on the SQLConsole tab.
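The three write modes described in step 3 differ only in how primary-key conflicts are handled. As a hedged illustration, the following sketch reproduces their semantics with an in-memory SQLite database, whose INSERT OR IGNORE and INSERT OR REPLACE behave like the INSERT_IGNORE and REPLACE_INTO modes for this comparison (MySQL itself uses INSERT IGNORE and REPLACE INTO):

```python
import sqlite3

# Illustrate the three write modes using an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")

# INSERT: a duplicate primary key value causes an error.
try:
    conn.execute("INSERT INTO t VALUES (1, 'new')")
    insert_failed = False
except sqlite3.IntegrityError:
    insert_failed = True

# INSERT_IGNORE: the conflicting new row is silently skipped.
conn.execute("INSERT OR IGNORE INTO t VALUES (1, 'new')")
after_ignore = conn.execute("SELECT name FROM t WHERE id = 1").fetchone()[0]

# REPLACE_INTO: the existing row is deleted and the new row is inserted.
conn.execute("INSERT OR REPLACE INTO t VALUES (1, 'new')")
after_replace = conn.execute("SELECT name FROM t WHERE id = 1").fetchone()[0]

print(insert_failed, after_ignore, after_replace)  # True old new
```

Choose INSERT_IGNORE when existing rows should win over the imported file, and REPLACE_INTO when the imported file should win.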

Sample files
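A minimal sample CSV file and SQL script of the kind accepted by a data import ticket can be generated with a short script such as the following. The table name `users` and its columns are hypothetical; note that the CSV uses commas as delimiters and that its first row contains the field names:

```python
import csv
import io

# Sketch: build a minimal comma-delimited CSV file and a matching SQL
# script suitable for upload in a data import ticket.
# The table "users" and its columns are hypothetical.
header = ["id", "name", "age"]
rows = [[1, "Alice", 30], [2, "Bob", 25]]

# CSV: comma-delimited; the first row contains the field names.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(rows)
csv_text = buf.getvalue()

# SQL script: one INSERT statement per row.
sql_text = "\n".join(
    f"INSERT INTO users (id, name, age) VALUES ({i}, '{n}', {a});"
    for i, n, a in rows
)

print(csv_text)
print(sql_text)
```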

FAQ

  • Q: What do I do if an error "permission denied for schema XXXXX" occurs during data import?

    A: This error indicates that the account that is used to log on to the database in DMS does not have the required permissions. You can log on to the database in DMS by using a privileged account and retry the data import. Alternatively, you can import the data to another accessible public database. For more information about how to change the database account, see Modify database instances.

    If the error persists, you can use Data Transmission Service (DTS) to migrate data. For more information, see Overview of data migration scenarios.

  • Q: What do I do if the data import ticket fails the precheck and the system reports that submitting specific statements is not allowed?

    A: You need to modify the security rules as a DBA or DMS administrator to allow the SQL statements of a specific type to be executed during data import. The following example shows how to allow the CREATE TABLE statement to be executed during data import:

    1. Log on to the DMS console and go to the Security Rules page.

    2. On the Security Rules page, find the security rule set that you want to manage and click Edit in the Actions column. On the Details page, click the SQL Correct tab.

    3. On the SQL Correct tab, set the Checkpoints parameter to Batch Data import rules. Find the Allow SQL statements to be executed during data import rule and click Edit in the Actions column.

    4. In the dialog box that appears, add the CREATE_TABLE statement in the Rule DSL field and click Submit. For more information, see SQL Correct.

    5. Go back to the details page of the data import ticket. In the Precheck step, click Retry.

  • Q: How do I upload a file whose size is larger than 5 GB?

    A: You can use one of the following methods based on your business requirements:

    • Split the file into multiple small files and submit multiple tickets to upload the small files.

    • Use a client such as Navicat to connect to the database remotely, and then import the file.
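For the first method, a large SQL script can be split at statement boundaries so that each part stays under the 5 GB limit. The following is a minimal sketch that assumes statements are separated by semicolons and contain no embedded semicolons inside string literals (a tiny size limit is used for illustration):

```python
# Sketch: split an SQL script into parts at statement boundaries so that
# each part stays under a size limit (5 GB for DMS; a tiny limit is used
# here for illustration). Assumes ";" only appears between statements.
def split_sql_script(script: str, max_bytes: int) -> list[str]:
    parts, current, size = [], [], 0
    for stmt in script.split(";"):
        stmt = stmt.strip()
        if not stmt:
            continue
        piece = stmt + ";\n"
        if current and size + len(piece) > max_bytes:
            parts.append("".join(current))
            current, size = [], 0
        current.append(piece)
        size += len(piece)
    if current:
        parts.append("".join(current))
    return parts

script = (
    "INSERT INTO t VALUES (1);"
    "INSERT INTO t VALUES (2);"
    "INSERT INTO t VALUES (3);"
)
parts = split_sql_script(script, max_bytes=30)
print(len(parts))  # 3
```

Each resulting part can then be uploaded in its own ticket.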