You can import data from a self-managed MySQL database or a MySQL database on an ApsaraDB RDS for MySQL instance to Simple Log Service. This way, you can query, analyze, and transform the data in Simple Log Service. This topic describes how to import data from a MySQL database to Simple Log Service.
Prerequisites
A MySQL database is created, and a database account that is granted the minimum required read permissions is created.
An IP address whitelist is configured for the MySQL database.
You must configure an IP address whitelist based on the region of the project to which the data import job belongs and the network of the database. For more information about IP addresses that you can add to a whitelist, see IP address whitelists.
A project and a Logstore are created. For more information, see Create a project and Create a Logstore.
Import data
Data import from a MySQL database is available in the following regions: China (Hangzhou), China (Shanghai), China (Qingdao), China (Beijing), China (Zhangjiakou), China (Hohhot), China (Ulanqab), China (Chengdu), China (Shenzhen), China (Heyuan), China (Guangzhou), China (Hong Kong), Singapore, Australia (Sydney), Malaysia (Kuala Lumpur), Indonesia (Jakarta), Philippines (Manila), Thailand (Bangkok), India (Mumbai), Japan (Tokyo), South Korea (Seoul), UK (London), Germany (Frankfurt), US (Virginia), and US (Silicon Valley).
Log on to the Simple Log Service console.
In the Projects section, click the project that you want to manage.
On the Logstores tab, click the > icon to the left of the Logstore, move the pointer over Data Import, and then click the + icon.
In the Import Data dialog box, click MySQL - Data Import.
In the Configure Import Settings step, create a data import configuration.
On the Specify Data Source tab, configure the following parameters.
If you use a MySQL database on an ApsaraDB RDS for MySQL instance that belongs to a virtual private cloud (VPC), select Use VPC Endpoint below Configuration Name and ApsaraDB RDS Instance below VPC-based Instance ID. Then, configure the following parameters.
Parameter
Description
VPC-based Instance ID
The ID of the VPC to which the ApsaraDB RDS for MySQL instance belongs.
RDS Instance ID
The ID of the ApsaraDB RDS for MySQL instance. You must set this parameter to the value of VpcCloudInstanceId that is specified for the ApsaraDB RDS for MySQL instance. You can call the DescribeDBInstanceAttribute operation to obtain the value of VpcCloudInstanceId.
If you use a self-managed MySQL database on an Elastic Compute Service (ECS) instance that belongs to a VPC, select Use VPC Endpoint below Configuration Name and Self-managed Database on ECS below VPC-based Instance ID. Then, configure the following parameters.
Parameter
Description
VPC-based Instance ID
The ID of the VPC to which the ECS instance belongs.
ECS Instance IP Address
The private IP address of the ECS instance.
If you use a MySQL database on an ApsaraDB RDS for MySQL instance that is accessible over the Internet or a self-managed MySQL database in other scenarios, clear Use VPC Endpoint and configure the following parameter.
Parameter
Description
Public IP Address or Domain Name of Database
The public IP address or domain name of the MySQL database.
The following table describes common parameters.
Parameter
Description
Configuration Name
The name of the data import configuration.
MySQL Port
The port that is used to connect to the MySQL database. Default value: 3306.
Username
The username of the account that is used to log on to the MySQL database.
We recommend that you grant the account only read permissions on the MySQL database.
Password
The password of the account that is used to log on to the MySQL database.
Database
The name of the MySQL database.
Query
The query statement. You can enter a SELECT statement or a SHOW statement. If you enter a SHOW statement, you cannot configure Page Size in the Advanced Settings section.
You can use a question mark (?) as a placeholder in the query statement. If you do, you must configure CheckPoint Field. For example, if you enter SELECT * FROM employees WHERE id > ? in Query, you must set CheckPoint Field to id and Starting Value of CheckPoint Field to 0. Each time the data import job is run, the system saves the ID of the last data record as a checkpoint. The next time the data import job is run, the system replaces the question mark (?) in the query statement with the checkpoint.
Advanced Settings
Maximum Number of Rows for Each Import
The maximum number of rows that can be imported each time the scheduled MySQL data import job is run. Default value: 0, which indicates that the number of rows is unlimited.
Connection Timeout Settings
The timeout period of connections to the MySQL database when the MySQL data import job is run. Default value: 3000. Unit: ms.
Query Timeout Settings
The timeout period of queries from the MySQL database when the MySQL data import job is run. Default value: 5000. Unit: ms.
Page Size
The number of data records on each page. The value 0 indicates that no pagination is performed. If you set this parameter to a value greater than 0, the system appends limit pageSize offset {checkpoint} to the SELECT statement, and pagination is performed to import data when the MySQL data import job is run.
Time Field
The time field of the MySQL data. Example: create_time or update_time. The value of this field is used as the log time of the data that is imported to Simple Log Service.
Time Field Format
The format of the time field. Java-based time formats are supported. Example: yyyy-MM-dd HH:mm:ss. For more information, see Time formats.
Time Zone
The time zone of the time field.
CheckPoint Field
The checkpoint field. A checkpoint indicates the position from which the system starts to import data. For example, if you enter select * from t where id > ? in Query, you must set CheckPoint Field to id, and id must be an auto-increment field. You can also specify other fields, such as create_time and update_time, for CheckPoint Field.
Note: If you do not specify a time range in the SQL statement, we recommend that you configure CheckPoint Field to import only incremental data and prevent repeated data collection.
Starting Value of CheckPoint Field
The initial value of the checkpoint field.
Additional Fields
The additional fields. Format: map[string]string. For example, if you enter select a, b from t; in Query, the system automatically imports the a and b fields. You can also add other fields to the data that is imported to Simple Log Service. Example: {"k1": "v1", "k2": "v2"}.
Click Preview to preview the import result.
Confirm the result and click Next.
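The checkpoint behavior described for the Query and CheckPoint Field parameters can be sketched in Python. This is a local simulation against an in-memory SQLite database, not the actual import job, and the employees table and its rows are only illustrative:

```python
import sqlite3

# Simulate the import job's checkpoint handling. The real job runs the
# configured query against your MySQL database on each scheduled run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO employees (name) VALUES (?)",
                 [("alice",), ("bob",), ("carol",)])

QUERY = "SELECT * FROM employees WHERE id > ?"  # Query parameter

def run_import(conn, checkpoint):
    """One scheduled run: replace ? with the saved checkpoint, then
    advance the checkpoint to the id of the last imported record."""
    rows = conn.execute(QUERY, (checkpoint,)).fetchall()
    if rows:
        checkpoint = rows[-1][0]  # CheckPoint Field is id
    return rows, checkpoint

checkpoint = 0  # Starting Value of CheckPoint Field
rows, checkpoint = run_import(conn, checkpoint)    # first run: all 3 rows
rows2, checkpoint = run_import(conn, checkpoint)   # no new rows: 0 rows
conn.execute("INSERT INTO employees (name) VALUES ('dave')")
rows3, checkpoint = run_import(conn, checkpoint)   # only the new row
print(len(rows), len(rows2), len(rows3), checkpoint)  # 3 0 1 4
```

Because only records whose id exceeds the saved checkpoint are read, each run imports incremental data and nothing is collected twice.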
On the Specify Scheduling Interval tab, configure the following parameters.
Parameter
Description
Import Interval
The interval at which data is imported from the MySQL database to Simple Log Service.
Execute Now
If you turn on Execute Now, the data import job is immediately run.
After the configuration is complete, click Next.
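The paging behavior that Page Size enables during each scheduled run can also be sketched locally. The following simulation appends limit pageSize offset to a SELECT statement, as described in the Advanced Settings section; SQLite stands in for MySQL, and the table t and its rows are illustrative:

```python
import sqlite3

# Simulate one scheduled run that imports data page by page.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(7)])

PAGE_SIZE = 3  # Page Size parameter

def import_all(conn, page_size):
    """Page through 'SELECT a, b FROM t' until a page comes back short."""
    imported, offset = [], 0
    while True:
        page = conn.execute(
            f"SELECT a, b FROM t LIMIT {page_size} OFFSET {offset}"
        ).fetchall()
        imported.extend(page)
        offset += len(page)
        if len(page) < page_size:  # last page reached
            return imported

rows = import_all(conn, PAGE_SIZE)
print(len(rows))  # 7 rows imported across 3 pages
```

Paging keeps each query small, which helps when the result set is too large to fetch in a single round trip.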
Preview data, configure indexes, and then click Next.
By default, full-text indexing is enabled for Simple Log Service. You can also configure field indexes based on collected logs in manual mode or automatic mode. To configure field indexes in automatic mode, click Automatic Index Generation. This way, Simple Log Service automatically creates field indexes. For more information, see Create indexes.
Important: If you want to query and analyze logs, you must enable full-text indexing or field indexing. If you enable both full-text indexing and field indexing, the system uses only field indexes.
Click Log Query. You are redirected to the query and analysis page of your Logstore. On the page, check whether the MySQL data import job is successful.
Wait for approximately 1 minute. If you can query the MySQL data, the data import job is successful.
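If the imported data carries the wrong time, a common cause is a mismatch between Time Field Format and the actual field values. The Java pattern yyyy-MM-dd HH:mm:ss from the example above corresponds to the strptime format below; this is a local sanity check you can run on sample values before you create the configuration, not part of the import job:

```python
from datetime import datetime

# Java time pattern "yyyy-MM-dd HH:mm:ss" maps to this strptime format.
JAVA_TO_STRPTIME = "%Y-%m-%d %H:%M:%S"

def matches_format(value: str, fmt: str = JAVA_TO_STRPTIME) -> bool:
    """Return True if a sample time value parses with the given format."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(matches_format("2024-05-01 12:30:00"))  # True
print(matches_format("2024/05/01 12:30:00"))  # False: wrong separator
```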
View the data import configuration and reports
After you create a data import configuration, you can view the configuration and related reports in the Simple Log Service console.
In the Projects section, click the project to which the data import configuration belongs.
Find and click the Logstore to which the data import configuration belongs, expand Data Import, and then click the name of the data import configuration.
On the Import Configuration Overview page, view the basic information about the data import configuration and the related reports.
What to do next
On the Import Configuration Overview page, you can perform the following operations:
Modify the data import configuration
To modify the data import configuration, click Modify Settings. For more information, see Import data.
Delete the data import configuration
To delete the data import configuration, click Delete Configuration.
Warning: After a data import configuration is deleted, it cannot be restored. Proceed with caution.