Simple Log Service:Getting started with data transformation

Last Updated: Aug 28, 2025

This topic uses website access logs as an example to walk you through the data transformation process. This helps you quickly understand the data transformation feature and its operations.

Prerequisites

  • A project named web-project is created. For more information, see Manage projects.

  • A source Logstore named website_log is created in the web-project project. For more information, see Create a Logstore.

  • Website access logs are collected in the source Logstore (website_log). For more information, see Data collection.

  • Destination Logstores are created in the web-project project. The details are as follows:

    • website-success: Stores logs of successful access requests. Corresponds to the target-success destination.

    • website-fail: Stores logs of failed access requests. Corresponds to the target-fail destination.

    • website-etl: Stores all other access logs. Corresponds to the target0 destination.

  • If you use a Resource Access Management (RAM) user, grant the RAM user permissions to perform data transformation operations. For more information, see Grant a RAM user permissions to perform data transformation operations.

  • Indexes are configured for the source and destination Logstores. For more information, see Create indexes.

    Important

    Data transformation tasks do not depend on indexes. However, if you do not configure indexes, you cannot perform query and analysis operations.

Background information

A website stores all its access logs in a single Logstore. The website owner wants to set different topics for successful and failed access logs and distribute them to different Logstores for analysis. The following is a sample log:

body_bytes_sent:1061
http_user_agent:Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5
remote_addr:192.0.2.2
remote_user:vd_yw
request_method:DELETE
request_uri:/request/path-1/file-5
status:207
time_local:10/Jun/2021:19:10:59
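
Each log arrives as key:value pairs, one field per line. The following Python sketch, which is illustrative only and not part of Simple Log Service, parses such a log into a field dictionary. The field names come from the sample above:

```python
# Parse a raw website access log (one key:value pair per line) into a
# dict of fields. Field names follow the sample log above; this is an
# illustrative sketch, not a Simple Log Service API.
sample = """\
body_bytes_sent:1061
remote_addr:192.0.2.2
request_method:DELETE
request_uri:/request/path-1/file-5
status:207
time_local:10/Jun/2021:19:10:59"""

def parse_log(text):
    event = {}
    for line in text.splitlines():
        # Split on the first colon only: values such as time_local
        # contain additional colons.
        key, _, value = line.partition(":")
        event[key] = value
    return event

event = parse_log(sample)
print(event["status"])  # → 207
```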

Step 1: Create a data transformation task

  1. Log on to the Simple Log Service console.

  2. Go to the data transformation page.

    1. In the Projects section, click the project you want.

    2. On the Log Storage > Logstores tab, click the Logstore you want.

    3. On the query and analysis page, click Data Transformation.

  3. In the upper-right corner of the page that appears, specify the time range of the data that you want to transform.

    After you specify the time range, verify that logs appear on the Raw Logs tab.

  4. In the editor, enter the transformation statements.

    e_if(e_search("status:[200,299]"),e_compose(e_set("__topic__","access_success_log"),e_output(name="target-success")))
    e_if(e_search("status:[400,499]"),e_compose(e_set("__topic__","access_fail_log"),e_output(name="target-fail")))

    The e_if function executes the specified operations if the condition is true. For more information, see e_if.

    • Condition: e_search("status:[200,299]")

      If the value of the status field meets the condition, Operation 1 and Operation 2 are performed. For more information, see e_search.

    • Operation 1: e_set("__topic__","access_success_log")

      Adds the __topic__ field and sets its value to access_success_log. For more information, see e_set.

    • Operation 2: e_output(name="target-success")

      Sends the transformed data to the destination named target-success, which is mapped to the website-success Logstore when you create the task. For more information, see e_output.
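
Taken together, the two statements implement a routing rule: tag an event with a topic and emit it to a named destination when its status falls into a range; events that match neither condition fall through to the default destination. The following Python sketch simulates this logic for illustration and is not the Simple Log Service runtime:

```python
def route(event):
    """Simulate the two e_if statements above: set __topic__ and pick
    a destination by HTTP status range. Events that match neither
    condition fall through to the default destination (target0)."""
    status = int(event.get("status", 0))
    if 200 <= status <= 299:                       # e_search("status:[200,299]")
        event["__topic__"] = "access_success_log"  # e_set("__topic__", ...)
        return event, "target-success"             # e_output(name="target-success")
    if 400 <= status <= 499:                       # e_search("status:[400,499]")
        event["__topic__"] = "access_fail_log"
        return event, "target-fail"
    return event, "target0"                        # default destination

print(route({"status": "207"})[1])  # → target-success
print(route({"status": "404"})[1])  # → target-fail
print(route({"status": "503"})[1])  # → target0
```

Note that the sample log's status 207 falls in the [200,299] range, so it is routed to target-success and stored in the website-success Logstore.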

  5. Preview the data.

    1. Click Quick.

      Simple Log Service supports quick and advanced previews. For more information, see Preview and debug data.

    2. Click Preview Data.

      View the preview results.

      Important

      When you preview data, the logs are not sent to the destination Logstore. Instead, they are sent to a Logstore named internal-etl-log. The internal-etl-log Logstore is automatically created in the current project when you preview data for the first time. You cannot modify its configurations or write other data to it. This Logstore is free of charge.


  6. Create the data transformation task.

    1. Click Save As Transformation Job.

    2. In the Create Data Transformation Task panel, configure the following parameters.

      • Task Name: The name of the data transformation task.

      • Authorization Method: The method that grants the data transformation task permissions to read data from the source Logstore.

        • Default Role: The task assumes the AliyunLogETLRole system role to read data from the source Logstore.

        • Custom Role: The task assumes a custom role to read data from the source Logstore. You must grant the custom role the permissions to read from the source Logstore, and then enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

      • Destination: Configure the following parameters for each destination.

        • Destination Name: The name of the destination. The destination includes configurations such as the project and Logstore. The name must be the same as the name parameter that you configured in Step 4.

          Note

          Simple Log Service uses the destination configured with ordinal number 1 (target0 in this example) as the default destination. This destination stores logs that do not meet any of the specified conditions.

        • Destination Region: The region where the destination project is located. Cross-region data transformation uses the HTTPS protocol to ensure data privacy. Cross-region data transformation transmits data over the Internet, which may cause task latency due to network uncertainties. You can select the DCDN Acceleration check box to speed up cross-region data transmission. If you use DCDN acceleration, make sure that acceleration is enabled for the project. For more information, see Enable acceleration for data collection.

          Important

          When you pull data from the public endpoint of Simple Log Service, Internet traffic fees are incurred for data reads. The fees are calculated based on the amount of compressed data. For more information, see Pay-as-you-go.

        • Destination Project: The name of the destination project where the transformed data is stored.

        • Destination Logstore: The name of the destination Logstore where the transformed data is stored.

        • Authorization Method: The method that grants the data transformation task permissions to write data to the destination Logstore.

          • Default Role: The task assumes the AliyunLogETLRole system role to write transformed data to the destination Logstore.

          • Custom Role: The task assumes a custom role to write transformed data to the destination Logstore. You must grant the custom role the permissions to write to the destination Logstore, and then enter the ARN of the custom role in the Role ARN field. For more information, see Access data by using a custom role.

      • Transformation Scope: Specify the scope of the data to transform.

        • Time Range: The time range for the data transformation task.

          Note

          This time range is based on the time when the logs are received.

          • All: Starts the task from the time when the first log is received by the Logstore. The task runs until it is manually stopped.

          • From A Specific Time: Specifies the start time. The task starts at the specified time and runs until it is manually stopped.

          • Specific Time Range: Specifies the start and end times. The task automatically stops at the specified end time.

    3. Click OK.

After the logs are distributed to the destination Logstores, you can query and analyze them. For more information, see Quick start for query and analysis.
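
For example, to count successful requests by method in the website-success Logstore, you could run a query similar to the following. This uses standard Simple Log Service query syntax, and the request_method field comes from the sample log:

```
__topic__: access_success_log | SELECT request_method, COUNT(*) AS pv GROUP BY request_method
```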

Step 2: View the data transformation task

  1. In the navigation pane on the left, choose Task Management > Data Transformation.

  2. In the list of data transformation tasks, click the task that you want to view.

  3. On the Data Transformation Overview page, you can view the details of the data transformation task.

    You can view the task details and status. You can also modify, start, stop, or delete the task. For more information, see Manage data transformation tasks.
