
Simple Log Service:Quick Start

Last Updated: Mar 13, 2026

This topic uses website access logs as an example to walk through the complete data transformation process, so that you can quickly become familiar with the data transformation feature and its operations.

Preparations

  • Create a project named web-project. For more information, see Manage projects.

  • In the web-project project, create a source Logstore named website_log. For more information, see Manage Logstores.

  • Website access logs have been ingested into the source Logstore (website_log). For more information, see Data Ingestion Overview.

  • In the web-project project, create a destination Logstore named website_fail.

  • If you use a Resource Access Management (RAM) user, grant the RAM user permissions to perform data transformation operations. For more information, see Grant a RAM user permissions to perform data transformation operations.

  • Configure indexes for the source and destination Logstores. For more information, see Create indexes.

Note

Data transformation jobs do not depend on indexes. However, if you do not configure indexes, you cannot perform query and analysis operations.

Background information

A website stores all its access logs in a Logstore named website_log. To improve user experience, you must analyze access errors. You need to filter access logs that have a status code of 4xx, remove personal user information, and write the results to a new Logstore named website_fail for business analysts to use. The following is a sample log:

body_bytes_sent: 1061
http_user_agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5
remote_addr: 192.0.2.2
remote_user: vd_yw
request_method: GET
request_uri: /request/path-1/file-5
status: 400
time_local: 10/Jun/2021:19:10:59
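For reference, after the transformation rule in this topic is applied, the corresponding output log in website_fail keeps the 4xx record but no longer contains the personal fields remote_addr and remote_user:

body_bytes_sent: 1061
http_user_agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5
request_method: GET
request_uri: /request/path-1/file-5
status: 400
time_local: 10/Jun/2021:19:10:59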

Step 1: Create a data transformation job

  1. Log on to the Simple Log Service console.

  2. Go to the data transformation page.

    1. In the Projects section, click the project you want.

    2. On the Log Storage > Logstores tab, click the Logstore that you want.

    3. On the query and analysis page, click Data Transformation.

  3. In the upper-right corner of the page, specify a time range for the log data that you want to transform.

    After you select a time range, verify that logs appear on the Raw Logs tab.

  4. In the editor, enter the following Structured Process Language (SPL) rule.

    *
    | extend status=cast(status as BIGINT)
    | where status>=400 AND status<500
    | project-away remote_addr, remote_user
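
    The rule casts status to an integer, filters by status code, and removes the personal fields remote_addr and remote_user. As a point of comparison, an allowlist-style rule can use the SPL project command instead of project-away to keep only the fields that the analysts need. The following is only a sketch; the field list follows the sample log in this topic:

    *
    | extend status=cast(status as BIGINT)
    | where status>=400 AND status<500
    | project status, request_method, request_uri, time_local, body_bytes_sent, http_user_agent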
    
  5. Debug the SPL rule.

    1. Select test data on the Raw Data tab or manually enter test data.

    2. Run the test.

    3. View the preview results.

  6. Create a data transformation job.

    1. Click Save as Transformation Job (New Version).

    2. In the Create Data Transformation Job (New Version) panel, configure the following parameters and click OK.

    Job Name: The name of the data transformation job.

    Display Name: The display name of the job.

    Job Description: The description of the job.

    Authorization Method: The method that is used to authorize the data transformation job to read data from the source Logstore. Options:

    • Default Role: allows the data transformation job to use the Alibaba Cloud system role AliyunLogETLRole to read data from the source Logstore. Authorize the system role AliyunLogETLRole, and then configure the other parameters as prompted to complete the authorization. For more information, see Access data using a default role.

      Important

      If you use a RAM user, the authorization must be completed by using your Alibaba Cloud account. If the authorization is already complete, skip this step.

    • Custom Role: authorizes the data transformation job to assume a custom role to read data from the source Logstore. Grant the custom role the permissions to read data from the source Logstore, and then enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

    Storage Destination

    Destination Name: The name of the storage destination. In the Storage Destination section, you must configure parameters such as Destination Project and Target Store.

    Destination Region: The region of the project to which the destination Logstore belongs.

    Destination Project: The name of the project to which the destination Logstore belongs. The SPL rules can specify the destination project dynamically. For more information, see Output configuration of processing results. If the SPL rules specify a project, that project is used; otherwise, this default project is used.

    Important

    The project that the SPL rules specify dynamically must match the configured region and authorization.

    Target Store: The name of the destination Logstore, which stores the transformed data. The SPL rules can specify the destination Logstore dynamically. For more information, see Output configuration of processing results. If the SPL rules specify a Logstore, that Logstore is used; otherwise, this default Logstore is used.

    Important

    The Logstore that the SPL rules specify dynamically must match the configured region, authorization, and project. The source and destination Logstores must be different.

    Authorization Method: The method that is used to authorize the data transformation job to write transformed data to the destination Logstore. Options:

    • Default Role: allows the data transformation job to use the Alibaba Cloud system role AliyunLogETLRole to write the transformation results to the destination Logstore. Authorize the system role AliyunLogETLRole, and then configure the other parameters as prompted to complete the authorization. For more information, see Access data using a default role.

      Important

      If you use a RAM user, the authorization must be completed by using your Alibaba Cloud account. If the authorization is already complete, skip this step.

    • Custom Role: authorizes the data transformation job to assume a custom role to write transformed data to the destination Logstore. Grant the custom role the permissions to write data to the destination Logstore, and then enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

    Write to Result Set: The dataset to write to the destination Logstore. For more information, see Dataset description. You can configure multiple datasets for a single destination, and multiple destinations can select the same dataset.

    Time Range for Data Transformation

    Time Range for Data Transformation (Data Receiving Time): The time range of the data to transform, based on the time at which the data is received.

    • All: The job transforms data in the source Logstore from the first log until the job is manually stopped.

    • From Specific Time: The job transforms data in the source Logstore from the log that is received at the specified start time until the job is manually stopped.

    • Specific Time Range: The job transforms data in the source Logstore from the log that is received at the specified start time to the log that is received at the specified end time.

    Advanced Options

    Advanced Parameter Settings: Transformation statements sometimes need secrets, such as database passwords. Simple Log Service lets you store each secret as a key-value pair and reference it in a statement with res_local("key") instead of hard-coding the value.

    Click the + icon to add more key-value pairs. For example, add config.vpc.vpc_id.test1:vpc-uf6mskb0b****n9yj to indicate the ID of the virtual private cloud (VPC) to which an ApsaraDB RDS instance belongs.
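
    For example, assuming a stored key-value pair whose key is db_password (a hypothetical name), a transformation statement can reference the stored value with res_local rather than embedding the password literal. The following is only a sketch that materializes the value into a field for illustration; in practice, you would pass the value to whatever part of the statement needs the credential:

    *
    | extend secret=res_local("db_password")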

  7. Go to the destination Logstore (website_fail) to perform query and analysis operations. For more information, see Quick guide to query and analysis.
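
    For example, after indexes are configured on website_fail, a query such as the following counts the transformed error logs by status code. This is a sketch that assumes the status field is indexed with statistics enabled:

    * | SELECT status, count(*) AS error_count GROUP BY status ORDER BY error_count DESC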

Step 2: Observe the data transformation job

  1. In the left-side navigation pane, choose Job Management > Data Transformation.

  2. In the list of data transformation jobs, find and click the data transformation job that you want to manage.

  3. On the Data Transformation Overview (New Version) page, view the details and status of the data transformation job. You can also modify, start, stop, and delete the job. For more information, see Manage data transformation jobs (new version). To observe the job's running status and metrics, see Observe and monitor data transformation jobs (new version).