Import data from a MySQL database

Last Updated: Aug 15, 2025

You can import data from a self-managed MySQL database or an ApsaraDB RDS for MySQL database into Simple Log Service. After the data is imported, you can query, analyze, and process it. This topic describes how to import data from a MySQL database.

Warning

The MySQL Data Import feature of Simple Log Service will no longer be updated or maintained. The end-of-life date and migration details will be announced later. Use DataWorks for data synchronization. For more information, see MySQL data source.

Limits

  • This feature is available in the following regions: China (Hangzhou), China (Shanghai), China (Qingdao), China (Beijing), China (Zhangjiakou), China (Hohhot), China (Ulanqab), China (Chengdu), China (Shenzhen), China (Heyuan), China (Guangzhou), China (Hong Kong), Singapore, Malaysia (Kuala Lumpur), Indonesia (Jakarta), Philippines (Manila), Thailand (Bangkok), Japan (Tokyo), South Korea (Seoul), UK (London), Germany (Frankfurt), US (Virginia), and US (Silicon Valley).

  • If the data source is AnalyticDB for MySQL, you must convert JSON fields to the VARCHAR type before you import them. For example, if the detail field is of the JSON type, you can use cast(detail as VARCHAR) as detail to convert it. For more information about the CAST function, see CAST functions.
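    The following statement is a minimal sketch of this conversion. The table name orders and the column order_id are hypothetical; only the cast(detail as VARCHAR) as detail conversion comes from the requirement above.

      -- Hypothetical source table with a JSON column named detail.
      -- Casting the JSON column to VARCHAR lets the import task read it as plain text.
      SELECT
        order_id,
        CAST(detail AS VARCHAR) AS detail
      FROM orders;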

Prerequisites

  • A MySQL database is created. A database account is also created and granted the minimum read permissions. For a sample set of statements that grants read-only access, see the sketch after this list.

  • The required IP address is added to the whitelist of the MySQL database.

    Add the IP address to the whitelist based on the region of the project and the network type of the database. For more information about the IP addresses that you can add to a whitelist, see IP address whitelists. For more information about how to add an IP address to the whitelist of a MySQL database, see Configure an IP address whitelist.

  • A project and a Logstore are created. For more information, see Manage a project and Create a Logstore.
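The database account in the prerequisites needs only read permissions. The following statements are a minimal sketch of how such an account can be created in MySQL. The account name sls_reader, the host pattern, the password placeholder, and the database name mydb are hypothetical; adjust them to your environment.

  -- Create a read-only account for the import task. All names and the password are placeholders.
  CREATE USER 'sls_reader'@'%' IDENTIFIED BY '<your_password>';
  -- Grant only the SELECT permission on the database to be imported.
  GRANT SELECT ON mydb.* TO 'sls_reader'@'%';
  FLUSH PRIVILEGES;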

Import data

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Log Storage > Logstores tab, click the > icon to the left of the destination Logstore. Then, move the pointer over Data Collection > Data Import and click the + icon.

  4. In the Quick Data Import dialog box, search for and click MySQL-Data Import.

  5. Set the import configuration.

    1. On the Data Source Settings tab, configure the following parameters.

      Basic parameters

      • Task Name: The unique name of the import task in Simple Log Service.

      • Display Name: The display name of the task.

      • Task Description: The description of the import task.

      • Use VPC Address: Configure this parameter based on how the MySQL database is accessed.

        • RDS for MySQL database in a VPC: Select Use VPC Address and RDS Instance, and set the following parameters.

          • VPC Instance ID: The ID of the VPC to which the RDS for MySQL database belongs.

          • RDS Instance ID: Set this parameter to the VpcCloudInstanceId of the RDS instance. You can call the DescribeDBInstanceAttribute operation to obtain the VpcCloudInstanceId.

        • Self-managed MySQL database on an ECS instance in a VPC: Select Use VPC Address and Self-managed Database On ECS, and set the following parameters.

          • VPC Instance ID: The ID of the VPC to which the ECS instance belongs.

          • ECS Instance IP: The private IP address of the ECS instance.

        • Database accessible over the internet: Clear Use VPC Address, and set the following parameter.

          • Database Public IP Or Domain Name: The public IP address or domain name of the MySQL database.

      • MySQL Port: The service port of the MySQL database. The default value is 3306.

      • Username: The name of the account used to log on to the MySQL database. Grant only read permissions on the MySQL data to this account.

      • Password: The password of the account.

      • Database: The name of the MySQL database.

      • Query Statement: You can use a SELECT statement or a SHOW statement. If you use a SHOW statement, you cannot set the Page Size parameter in the advanced configuration.

        A question mark (?) can be used as a placeholder that works with the Checkpoint Field parameter. For example, if the query statement is SELECT * FROM employees WHERE id > ?, set Checkpoint Field to id and Checkpoint Field Start Value to 0. After a data import task is complete, the system saves the ID of the last imported record as a checkpoint. The next time the task runs, the system replaces the question mark (?) in the query statement with the checkpoint value.
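        The following sketch illustrates how the placeholder behaves across runs, based on the employees example above. The concrete checkpoint values are for illustration only.

          -- Query statement configured in the console:
          SELECT * FROM employees WHERE id > ?;

          -- First run: the placeholder is replaced with the Checkpoint Field Start Value (0).
          SELECT * FROM employees WHERE id > 0;

          -- If the last imported row has id = 1024, the next scheduled run executes:
          SELECT * FROM employees WHERE id > 1024;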

      Advanced parameters

      • Max Rows Per Import: The MySQL import task runs on a schedule. You can set the maximum number of rows for each import. The default value is 0, which indicates no limit.

      • Connection Timeout: The timeout period for the MySQL import task to connect to the MySQL database. The default value is 3000 ms.

      • Query Timeout: The timeout period for the MySQL import task to query the MySQL database. The default value is 5000 ms.

      • Page Size: A value of 0 indicates that pagination is disabled. If you set this parameter to a value greater than 0, the system appends limit pageSize offset {checkpoint} to the SELECT statement to import data in batches, as shown in the sketch after this list.

      • Time Field: The field in the MySQL data that represents time, such as create_time or update_time. Simple Log Service uses the value of this field as the data import time.

      • Time Field Format: The format of the time field. Java time formats are supported. Example: yyyy-MM-dd HH:mm:ss. For more information, see Time formats.

      • Time Field Time Zone: The time zone of the time field.

      • Checkpoint Field: The field that serves as the starting point for the next data import. For example, if the query statement is select * from t where id > ?, where id is an auto-increment field, you can set id as the checkpoint field. You can also use fields such as create_time or update_time.

        Note: If you do not specify a time range in the SQL statement, set the Checkpoint Field parameter to import incremental data and prevent data duplication.

      • Checkpoint Field Start Value: The start value of the checkpoint field.

      • Additional Fields: Additional fields in the map[string]string format, such as {"k1": "v1", "k2": "v2"}. For example, if the query statement is select a, b from t;, the system imports only the a and b fields by default. You can use this parameter to import other fields into Simple Log Service.
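      The following sketch shows the statement form that results from the Page Size parameter, assuming the query statement select * from t where id > ? and a Page Size of 500. The table name t and the value 500 are for illustration only; {checkpoint} stands for the saved checkpoint value.

        -- Statement form when Page Size is greater than 0 (500 in this sketch):
        select * from t where id > ? limit 500 offset {checkpoint}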

    2. Click Preview to preview the import results. After you confirm that the results are correct, click Next Configuration.

    3. On the Scheduling Interval tab, configure the following parameters.

      • Import Interval: The interval at which data is imported from the MySQL database to Simple Log Service.

      • Execute Immediately: Turn on Execute Immediately to run the import task once right away.

    4. After you complete the configuration, click Next.

  6. Create indexes and preview data. Then, click Next. By default, full-text indexing is enabled in Simple Log Service. You can also manually create field indexes for the collected logs, or click Automatic Index Generation to have Simple Log Service generate field indexes automatically. For more information, see Create indexes.

    Important

    If you want to query all fields in logs, we recommend that you use full-text indexes. If you want to query only specific fields, we recommend that you use field indexes, which helps reduce index traffic. If you want to analyze fields, you must create field indexes and include a SELECT statement in your query statement for analysis.

  7. You must wait approximately 1 minute for the indexes to take effect. Then, you can view the collected logs on the Raw Logs tab. For more information about how to query and analyze logs, see Guide to log query and analysis.
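    For example, after the indexes take effect, a query statement for analysis follows the search-statement-plus-analytic-statement format and must include a SELECT statement. The following is a minimal sketch that counts imported rows per value of a field; the field name department is hypothetical and requires a field index.

      * | SELECT department, COUNT(*) AS cnt GROUP BY department ORDER BY cnt DESC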

View the import configuration and statistical reports

After an import configuration is created, you can view the configuration and its statistical reports in the console.

  1. Click the destination project.

  2. In the destination Logstore, choose Data Collection > Data Import and click the destination import configuration.

  3. On the Import Configuration Overview page, view the basic information and statistical reports of the import configuration.

Related operations

On the Import Configuration Overview page, you can also perform the following operations.

  • Modify the configuration

    Click Modify Configuration to modify the import configuration. For more information about the parameters, see Import data.

  • Delete the configuration

    Click Delete Configuration to delete the import configuration.

    Warning

    This operation cannot be undone. Proceed with caution.