Simple Log Service: Ship logs to AnalyticDB for MySQL

Last Updated: Aug 11, 2023

After you collect logs by using Simple Log Service, you can ship the logs to AnalyticDB for MySQL for data storage and analysis. This topic describes how to ship logs from Simple Log Service to AnalyticDB for MySQL.

Note

Data is shipped to AnalyticDB for MySQL based on REPLACE INTO semantics. You can execute the REPLACE INTO statement to insert data into a table and overwrite existing data in real time. When data is written, the statement checks whether the data to be written already exists in the table based on the primary key and then writes the data based on the check result, as shown in the example after this note.

  • If the data to be written already exists, the statement overwrites the existing data.

  • If the data to be written does not exist, the statement inserts the data.

For more information, see REPLACE INTO.
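
The following statement is a minimal sketch of this behavior. The table name log_table and its columns are assumptions used only for illustration; replace them with your own destination table and schema.

  -- Assumed table: log_table(id BIGINT PRIMARY KEY, level VARCHAR(64), message VARCHAR(1024))
  -- If a row with id = 1 already exists, it is overwritten. Otherwise, a new row is inserted.
  REPLACE INTO log_table (id, level, message) VALUES (1, 'ERROR', 'connection timed out');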

Prerequisites

  • Logs are collected to a destination Logstore. For more information, see Data collection overview.

  • The following operations are performed in the AnalyticDB for MySQL console:

    1. An AnalyticDB for MySQL cluster is created in the region where the Simple Log Service project resides. For more information, see Create a cluster.

      Note

Simple Log Service can ship logs only to an AnalyticDB for MySQL cluster that resides in the same region as the Simple Log Service project.

    2. A database account is created. For more information, see Create a database account.

    3. A database is created. For more information, see Create a database.

    4. If you need to access AnalyticDB for MySQL clusters over the Internet, you must first apply for a public endpoint. For more information, see Apply for or release a public endpoint.

    5. A table is created in the database. For more information, see CREATE TABLE. A sample statement is provided after this list.
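
The following statement is a minimal sketch of such a table, assuming that each log contains an id, a timestamp, a level, and a message field. The table name sls_log and all column names are assumptions; define the columns based on the log fields that you plan to ship, and note that your cluster may also require options such as a distribution key. The primary key is what REPLACE INTO uses to decide whether to overwrite an existing row.

  CREATE TABLE sls_log (
    id BIGINT NOT NULL,        -- assumed unique key; REPLACE INTO deduplicates rows by primary key
    log_time DATETIME,         -- timestamp field; accurate to the second after shipping
    level VARCHAR(64),         -- example log field
    message VARCHAR(1024),     -- example log field
    PRIMARY KEY (id)
  );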

Create a data shipping job

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Log Storage > Logstores tab, click the > icon next to the destination Logstore. Then, choose Data Transformation > Export, and click + next to AnalyticDB.

  4. In the Shipping Notes dialog box, click Ship.

  5. The first time you create a data shipping job to ship logs from Simple Log Service to AnalyticDB for MySQL, you must create the AliyunServiceRoleForAnalyticDBForMySQL service-linked role.

    1. In the Create Service Linked Role dialog box, click AliyunServiceRoleForAnalyticDBForMySQL.

    2. In the Create Service Linked Role message, click OK.

  6. On the LogHub - Data Shipper page, configure the parameters and click OK.

    • Shipper Name: The name of the data shipping job.

    • Shipper Description: The description of the data shipping job.

    • Cluster Version: The version of the AnalyticDB for MySQL cluster. In this example, 3.0 is selected.

    • Cluster Name: The AnalyticDB for MySQL cluster that you created.

    • Database Name: The database that you created in the AnalyticDB for MySQL cluster.

    • Table Name: The table that you created in the AnalyticDB for MySQL database.

    • Account Name: The database account that you created in the AnalyticDB for MySQL cluster.

    • Account Password: The password of the database account.

    • Field Mapping: Simple Log Service extracts all log fields of the last 10 minutes and maps the fields to the destination fields in the AnalyticDB for MySQL table. Enter the names of the log fields in the left text boxes and the names of the fields in the AnalyticDB for MySQL table in the right text boxes.

      Note

      The mapping of fields of the timestamp type is accurate to the second. For example, if the timestamp of a log in Simple Log Service is 2022-01-05 10:12:13.145, the timestamp is recorded as 2022-01-05 10:12:13 after the log is shipped to AnalyticDB for MySQL.

    • Delivery Start Time: The start time of the data shipping job. After logs are collected to Simple Log Service, the logs are shipped to AnalyticDB for MySQL in real time.

    • Filter Dirty Data: Dirty data includes data whose data type fails to be converted and data whose required fields are empty.

      • If you turn on Filter Dirty Data, the data shipping job filters out dirty data.

      • If you turn off Filter Dirty Data and dirty data exists, the data shipping job is interrupted. Proceed with caution.

After you create a data shipping job, you can manage the job in the Simple Log Service console. You can view the job details, modify the shipping rule, and start, stop, or delete the job. For more information, see Manage a data shipping job.

View log data

After logs are shipped to AnalyticDB for MySQL, you can execute a SELECT statement to query log data.
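
For example, the following query is a sketch based on the sample sls_log table described above; the table and column names are assumptions, so adjust them to match your own schema.

  -- Return the 100 most recent error logs from the assumed sls_log table.
  SELECT log_time, level, message
  FROM sls_log
  WHERE level = 'ERROR'
  ORDER BY log_time DESC
  LIMIT 100;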