Simple Log Service: Create a data transformation job

Last Updated: Sep 02, 2024

Simple Log Service allows you to create a data transformation job to read data from a source Logstore and write transformed data to one or more destination Logstores. You can also query and analyze the transformed data to create more value. This topic describes how to create a data transformation job in the Simple Log Service console.

Prerequisites

Procedure

  1. Log on to the Simple Log Service console.

  2. Go to the data transformation page.

    1. In the Projects section, click the project that you want to manage.

    2. On the Log Storage > Logstores tab, click the Logstore that you want to manage.

    3. On the query and analysis page, click Data Transformation.

  3. In the upper-right corner of the page that appears, specify the time range of the data that you want to transform.

    After you specify the time range, verify that logs appear on the Raw Logs tab.

  4. In the code editor, enter a data transformation statement.

    For more information about the statement syntax, see Language introduction.
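    The following statement is a minimal sketch of the DSL syntax, not a statement from this topic. The field names level and severity and the value ERROR are assumptions for illustration: the statement adds a severity field to logs whose level field contains ERROR.

      # Assumed fields: "level" exists in the raw logs; "severity" is added.
      e_if(e_search("level: ERROR"), e_set("severity", "high"))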

  5. Preview data.

    1. Select Quick.

      You can select Quick or Advanced. For more information, see Preview mode overview.

    2. Click Preview Data.

      View the results.

      • If data fails to be transformed because the transformation statement is invalid or the required permissions are missing, follow the on-screen instructions to troubleshoot the failure.

      • If the transformed data is returned as expected, go to Step 6.

  6. Create a data transformation job.

    1. Click Save as Transformation Job (Old Version).

    2. In the Create Data Transformation Job panel, configure the parameters and click OK. The following list describes the parameters.

      Job Name

      The name of the data transformation job.

      Display Name

      The display name of the data transformation job.

      Job Description

      The description of the data transformation job.

      Authorization Method

      The method that is used to authorize the data transformation job to read data from the source Logstore.

      • Default Role: The data transformation job assumes the AliyunLogETLRole system role to read data from the source Logstore.

      • Custom Role: The data transformation job assumes a custom role to read data from the source Logstore.

        You must grant the custom role the permissions to read from the source Logstore. Then, you must enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

      • AccessKey Pair: The data transformation job uses the AccessKey pair of an Alibaba Cloud account or a RAM user to read data from the source Logstore.

        • Alibaba Cloud account: The AccessKey pair of an Alibaba Cloud account has permissions to read from the source Logstore. You can directly enter the AccessKey ID and AccessKey secret of the Alibaba Cloud account in the AccessKey ID and AccessKey Secret fields. For more information about how to obtain an AccessKey pair, see AccessKey pair.

        • RAM user: You must grant the RAM user the permissions to read from the source Logstore. Then, you can enter the AccessKey ID and AccessKey secret of the RAM user in the AccessKey ID and AccessKey Secret fields. For more information, see Access data by using AccessKey pairs.

      Storage Destination

      Destination Name

      The name of the storage destination. In the Storage Destination section, you must configure parameters including Destination Project and Target Store.

      You can create multiple storage destinations to store the transformed data in different destination Logstores.

      • You can use the name parameter of the e_output or e_coutput function in the transformation statement to specify the name of the storage destination. For more information, see e_output and e_coutput.

      • If you do not include the e_output function in the transformation statement, the job writes the transformed data to the Logstore in the storage destination that is numbered 1 by default.

        If you want to configure only one destination Logstore, you do not need to include the e_output function in the transformation statement.

      • If you include the e_output or e_coutput function and configure the name, project, and logstore parameters for the function, the job runs based on the parameter settings in the functions even if you configure the Destination Project and Target Store parameters in this step.
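      The following statement is a sketch of how the name parameter routes data to a named storage destination. The destination name target-error and the status field are assumptions for illustration.

        # Write logs whose "status" field contains 500 to the storage
        # destination named "target-error"; all other logs are written to
        # the storage destination that is numbered 1.
        e_if(e_search("status: 500"), e_output(name="target-error"))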

      Destination Region

      The region of the project to which the destination Logstore belongs.

      If you want to perform data transformation across regions, we recommend that you use HTTPS for data transmission to help protect the privacy of log data.

      For cross-region data transformation, the data is transmitted over the Internet. If the Internet connection is unstable, data transformation may be delayed. You can select DCDN Acceleration to accelerate cross-region data transmission. Before you select DCDN Acceleration, make sure that the global acceleration feature is enabled for the destination project. For more information, see Log collection acceleration.

      Note

      When data is transmitted over the Internet across regions, you are charged for traffic that is generated. The volume of traffic is calculated based on the size of data after compression. For more information, see Billable items of pay-by-feature.

      Destination Project

      The name of the project to which the destination Logstore belongs.

      Target Store

      The name of the destination Logstore, which stores transformed data.

      Authorization Method

      The method that is used to authorize the data transformation job to write transformed data to the destination Logstore.

      • Default Role: The data transformation job assumes the AliyunLogETLRole system role to write transformed data to the destination Logstore.

      • Custom Role: The data transformation job assumes a custom role to write transformed data to the destination Logstore.

        You must grant the custom role the permissions to write to the destination Logstore. Then, you must enter the ARN of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

      • AccessKey Pair: The data transformation job uses the AccessKey pair of an Alibaba Cloud account or a RAM user to write transformed data to the destination Logstore.

        • Alibaba Cloud account: The AccessKey pair of an Alibaba Cloud account has permissions to write to the destination Logstore. You can directly enter the AccessKey ID and AccessKey secret of the Alibaba Cloud account in the AccessKey ID and AccessKey Secret fields. For more information about how to obtain an AccessKey pair, see AccessKey pair.

        • RAM user: You must grant the RAM user the permissions to write to the destination Logstore. Then, you can enter the AccessKey ID and AccessKey secret of the RAM user in the AccessKey ID and AccessKey Secret fields. For more information, see Access data by using AccessKey pairs.

      Time Range for Data Transformation

      Time Range

      The time range of the data that is transformed.

      Note

      The time range is evaluated based on the time when logs are received, not on the timestamps of the logs.

      • All: The job transforms data in the source Logstore from the first log until the job is manually stopped.

      • From Specific Time: The job transforms data in the source Logstore from the log that is received at the specified start time until the job is manually stopped.

      • Within Specific Period: The job transforms data in the source Logstore from the log that is received at the specified start time to the log that is received at the specified end time.

      Advanced Options

      Advanced Parameter Settings

      Transformation statements sometimes need to reference secrets, such as database passwords. Simple Log Service allows you to store such values as key-value pairs and reference them in your statement by using res_local("key").

      You can click the + icon to add more key-value pairs. For example, you can add config.vpc.vpc_id.test1:vpc-uf6mskb0b****n9yj to indicate the ID of the virtual private cloud (VPC) to which an ApsaraDB RDS instance belongs.
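      The following statement is a sketch of how a stored key-value pair can be referenced. It assumes that a key named config.db.password is added in Advanced Parameter Settings; the connection details, table name, and field names are also hypothetical. The statement enriches logs by joining them against an ApsaraDB RDS for MySQL table without hard-coding the database password.

        # Pull the database password from Advanced Parameter Settings
        # instead of writing it in the statement.
        e_table_map(
            res_rds_mysql(
                address="rds-host.example.internal",
                username="etl_reader",
                password=res_local("config.db.password"),
                database="user_db",
                table="user_info"
            ),
            "uid",
            ["name", "region"]
        )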

What to do next

After the data transformation job is created, you can perform the following operations:

  • On the Data Transformation Overview page, view the details of the job. You can also perform other operations, such as modifying or stopping the job. For more information, see Manage a data transformation job.

  • In the destination Logstore, perform query and analysis operations. For more information, see Query and analyze logs.