Simple Log Service: Create a data transformation job (new version)

Last Updated: Mar 07, 2025

Simple Log Service allows you to use the data transformation feature (new version) to consume data in a source logstore based on Simple Log Service Processing Language (SPL) rules and write the SPL-based transformation results to a destination logstore. You can also query and analyze the transformed data to meet more business requirements. This topic describes how to create a data transformation job in the Simple Log Service console.

Prerequisites

Procedure

  1. Log on to the Simple Log Service console.

  2. Go to the data transformation page.

    1. In the Projects section, click the project that you want to manage.

    2. On the Log Storage > Logstores tab, click the logstore that you want to manage.

    3. On the query and analysis page, click Data Transformation.

  3. In the upper-right corner of the page, specify a time range for the log data that you want to transform.

    After you select a time range, verify that logs appear on the Raw Logs tab.

  4. In the edit box, enter an SPL statement.

    For more information about the SPL syntax, see SPL syntax.
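
    The following statement is a minimal example for reference only. It assumes that the raw logs contain a JSON-formatted content field and a status field. Replace the field names and values with the fields in your own logs.

      * | parse-json content | where status = '404' | extend error_flag = 'true' | project-away content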

  5. Preview transformation results.

    1. On the Raw Data tab, click Add Test Data. You can also manually enter test data on the Test Data tab.

    2. Click the debug icon to debug the SPL statement. For more information, see Debug SPL rules.

  6. Create a data transformation job.

    1. Click Save as Transformation Job (New Version).

    2. In the Create Data Transformation Job (New Version) panel, configure the following parameters and click OK.

    Job Name

    The name of the data transformation job.

    Display Name

    The display name of the job.

    Job Description

    The description of the job.

    Authorization Method

    The method that is used to authorize the data transformation job to read data from the source logstore. Options:

    • Default Role: allows the data transformation job to assume the Alibaba Cloud system role AliyunLogETLRole to read data from the source logstore. You must authorize the AliyunLogETLRole system role and configure the other parameters as prompted to complete the authorization. For more information, see Access data by using a default role.

      Important

      If you use a RAM user, make sure that the authorization is completed within your Alibaba Cloud account.

      If the authorization is already completed within your Alibaba Cloud account, you can skip this step.

    • Custom Role: authorizes the data transformation job to assume a custom role to read data from the source logstore.

    You must grant the custom role the permissions to read data from the source logstore. Then, enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

    Storage Destination

    Destination Name

    The name of the storage destination. In the Storage Destination area, you must configure parameters including Destination Project and Target Store.

    Destination Region

    The region of the project to which the destination logstore belongs.

    Destination Project

    The name of the project to which the destination logstore belongs. The destination project can also be dynamically specified by the SPL rules. For more information, see Output configuration of processing results. If the SPL rules dynamically specify a project, that project is used. Otherwise, the project that you configure here is used.

    Important

    The project dynamically specified by the SPL rules must match the currently configured region and authorization.

    Target Store

    The name of the destination logstore, which stores the transformed data. The destination logstore can also be dynamically specified by the SPL rules. For more information, see Output configuration of processing results. If the SPL rules dynamically specify a logstore, that logstore is used. Otherwise, the logstore that you configure here is used.

    Important

    The logstore dynamically specified by the SPL rules must match the currently configured region, authorization, and project.

    Authorization Method

    The method that is used to authorize the data transformation job to write transformed data to the destination logstore. Options:

    • Default Role: allows the data transformation job to assume the Alibaba Cloud system role AliyunLogETLRole to write the transformation results to the destination logstore. You must authorize the AliyunLogETLRole system role and configure the other parameters as prompted to complete the authorization. For more information, see Access data by using a default role.

    Important

    If you use a RAM user, make sure that the authorization is completed within your Alibaba Cloud account.

    If the authorization is already completed within your Alibaba Cloud account, you can skip this step.

    • Custom Role: authorizes the data transformation job to assume a custom role to write the transformation results to the destination logstore. You must grant the custom role the permissions to write data to the destination logstore. Then, enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

    Write to Result Set

    The dataset to be written to the destination logstore. For more information, see Dataset description. Multiple datasets can be configured for a single destination, and a single dataset can be selected by multiple destinations.

    Time Range for Data Transformation

    Time Range for Data Transformation (Data Receiving Time)

    The time range of the data that is transformed.

    • All: The job transforms data in the source logstore from the first log until the job is manually stopped.

    • From Specific Time: The job transforms data in the source logstore from the log that is received at the specified start time until the job is manually stopped.

    • Specific Time Range: The job transforms data in the source logstore from the log that is received at the specified start time to the log that is received at the specified end time.

    Advanced Options

    Advanced Parameter Settings

    You may need to use sensitive values, such as database passwords, in transformation statements. Simple Log Service allows you to store such values as key-value pairs and reference them in your statements by specifying res_local("key").

    You can click the + icon to add more key-value pairs. For example, you can add config.vpc.vpc_id.test1:vpc-uf6mskb0b****n9yj to indicate the ID of the virtual private cloud (VPC) to which an ApsaraDB RDS instance belongs.
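
    For example, you can store a database password as a key-value pair and reference it in a transformation statement instead of writing the password in plaintext. The key name config.db.password is used only for illustration. Use your own key name and value.

      Key:   config.db.password
      Value: <your database password>

      Reference in the statement: res_local("config.db.password")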

What to do next

After the data transformation job is created, you can perform the following operations:

  • On the Data Transformation Overview page, view the details and status of the job. You can also perform other operations, such as modifying or stopping the job.

  • In the destination logstore, perform query and analysis operations. For more information, see Guide to log query and analysis.
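
    The following query statement is an example for reference only. It assumes that the transformed logs contain a status field for which the analytics feature is enabled in the index. Replace the field name with a field in your own logs.

      * | SELECT status, COUNT(*) AS pv GROUP BY status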