
Simple Log Service:Process and save data from a Metricstore to another Metricstore

Last Updated:Nov 13, 2023

Simple Log Service provides the Scheduled SQL feature. You can use the feature to analyze data at a scheduled time and aggregate data for storage. You can also use the feature to project and filter data. Simple Log Service allows you to process data in a source Metricstore by configuring a Scheduled SQL job and save the processed data to a destination Metricstore.


Procedure

Important

The Scheduled SQL feature is in public preview. If you enable the feature, you are charged only for the computing resources that are consumed by Dedicated SQL. For more information, see Billable items of pay-by-feature.

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Time Series Storage > Metricstore tab, click the Metricstore that you want to manage.

  4. Perform query and analysis operations.

    1. In the upper-right corner of the page, click 15 Minutes (Relative) to specify a time range for the query.

    2. Enter a PromQL statement and click Execute Now.

      For more information, see Query and analyze metric data.

    Note

    This step allows you to preview data before you create a Scheduled SQL job. You can check whether the query statement that you entered is valid and whether the query and analysis results contain data.
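For instance, the PromQL statement that you enter in this step might look like the following sketch. The metric name cpu_usage_total and the label instance are hypothetical; substitute metrics that exist in your source Metricstore.

```promql
# Hypothetical metric: average per-second CPU usage per instance over the last minute
avg(rate(cpu_usage_total[1m])) by (instance)
```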

  5. Choose More > Save as Scheduled SQL Job.

  6. Create a Scheduled SQL job.

    1. In the Compute Settings step, configure the parameters and click Next. The following table describes the parameters.

      Parameter

      Description

      Job Name

      The name of the Scheduled SQL job.

      Job Description

      The description of the Scheduled SQL job.

      Resource Pool

      The resource pool that is used for data analysis. Simple Log Service provides an enhanced type of resource pool.

      The enhanced type of resource pool utilizes the computing capability of Dedicated SQL to meet concurrent analysis requirements. It isolates resources between Scheduled SQL and your SQL analysis operations in the console. You are charged for the enhanced type of resource pool based on the CPU time that is consumed by your SQL analysis operations. For more information, see Enable Dedicated SQL.

      Write Mode

      Select Import Data from Metricstore to Metricstore. The Scheduled SQL job processes the data in the source Metricstore and saves the processed data to the destination Metricstore.

      SQL Code

      The query statement. By default, the system displays the statement that you entered in Step 4. The preview operation that is provided for this parameter has the same effect as the preview operation in Step 4. You can click Preview to check whether the query statement is valid and whether the query and analysis results contain data.

      When the Scheduled SQL job runs, Simple Log Service executes the query statement to analyze data.

      Important

      We recommend that you use the promql_query function to import metric data of the current point in time to the destination Metricstore. If you use the promql_query_range function, each run returns data for multiple points in time, which can significantly inflate the amount of data that is written.

      SQL Settings

      Result Metric Name

      The new name of the metric that you select for analysis. If you want to rename the metric, specify a new name in this field. For more information, see Metric.

      Important

      • If you select a single metric for analysis, we recommend that you configure this parameter to rename the metric.

      • If you select multiple metrics for analysis and you configure this parameter, all of the metrics are renamed with the name that you specify.

      Rehash

      Specifies whether to enable hashing. If you turn on Rehash, you can configure the Hash Column parameter to write data with the same label to one shard. This improves data locality and query efficiency.

      Valid values of the Hash Column parameter vary based on the existing label information of your metrics. For example, if the existing label information of your metrics is {"alert_id":"alert-1608815762-545495","alert_name":"Alert clearance disabled","status":"inactive"}, the valid values of the Hash Column parameter are alert_id, alert_name, and status. If you set Hash Column to status, the metrics that have the same value for status are written to the same shard.

      Additional Labels

      The static labels that are used to identify the attributes of a metric. Each label is a key-value pair.

      For example, you can set label_key to app and label_value to ingress-nginx.

      Target

      Target Region

      The region where the destination project resides.

      Target Project

      The name of the destination project that stores the results of the query statement.

      Target Store

      The name of the destination Metricstore that stores the results of the query statement.

      Write Authorization

      The method that is used to authorize the Scheduled SQL job to write data to the destination Metricstore. Valid values:

      • Default Role: The Scheduled SQL job assumes the AliyunLogETLRole system role to write the analysis results to the destination Metricstore.

        Important

        The first time that you create a Scheduled SQL job, authorization is required and must be completed by using the Alibaba Cloud account to which the destination project belongs. You do not need to perform authorization when you create subsequent Scheduled SQL jobs.

      • Custom Role: The Scheduled SQL job assumes a custom role to write the analysis results to the destination Metricstore.

        You must grant the custom role the permissions to write data to the destination Metricstore. Then, enter the Alibaba Cloud Resource Name (ARN) of the custom role in the Role ARN field. For information about how to obtain the ARN, see the topic that matches your business scenario.

      SQL Execution Authorization

      The method that is used to authorize the Scheduled SQL job to read data from the source Metricstore and analyze the data by using query statements in the current project. Valid values:

      • Default Role: The Scheduled SQL job assumes the AliyunLogETLRole system role to perform the required operations.

        Important

        The first time that you create a Scheduled SQL job, authorization is required and must be completed by using the Alibaba Cloud account to which the destination project belongs. You do not need to perform authorization when you create subsequent Scheduled SQL jobs.

      • Custom Role: The Scheduled SQL job assumes a custom role to perform the required operations.

        You must grant the custom role the required permissions. Then, enter the ARN of the custom role in the Role ARN field. For more information, see Step 1: Grant RAM Role A the permissions to analyze log data in a source Logstore.

    2. In the Scheduling Settings step, configure the parameters and click OK. The following table describes the parameters.

      Parameter

      Description

      Scheduling Interval

      The frequency at which the Scheduled SQL job is scheduled. Each time the job is scheduled, an instance is generated. This parameter determines the point in time at which each instance is generated. Valid values:

      • Hourly: The job is scheduled every hour.

      • Daily: The job is scheduled at a fixed point in time every day.

      • Weekly: The job is scheduled at a fixed point in time on a fixed day of each week.

      • Fixed Interval: The job is scheduled at a fixed interval.

      • Cron: The job is scheduled at an interval that is specified by a cron expression.

        A cron expression is accurate to the minute and is based on the 24-hour clock. For example, the expression 0 0/1 * * * indicates that the job is scheduled every hour starting from 00:00.

        If you need to specify a time zone, select Cron. For a list of common time zones, see Time zones.

      Scheduling Time Range

      The time range during which the Scheduled SQL job is scheduled. Valid values:

      • Start at a specified time: specifies the point in time at which the job is first scheduled.

      • Within Specific Period: specifies the time range within which the job is scheduled.

      Important

      The job is scheduled only within the specified time range. After the end time, the job no longer generates instances.

      SQL Time Window

      The time window of the data that is analyzed when the Scheduled SQL job runs. Only data that is generated within this window is analyzed. This parameter takes effect together with the Scheduling Time Range parameter. The duration of the SQL time window can be up to five times the scheduling interval, and the start time and end time of the window must be within 24 hours of each other. For more information, see Time expression syntax.

      For example, if Scheduling Interval is set to Fixed Interval: 10 Minutes, Start Time is set to 2021-04-01 00:00:00, Delay Task is set to 30 seconds, and SQL Time Window is set to [@m-10m,@m), the first instance is generated at 00:00:30 and analyzes the data that was written within the time range [23:50:00, 00:00:00). For more information, see Scheduling and running scenarios.

      SQL Timeout

      The maximum retry period and maximum number of retries if the SQL statement fails to analyze data. If the retry period or the number of retries exceeds the specified threshold, the instance of the Scheduled SQL job fails. You can identify the failure cause and then manually rerun the instance. For more information, see Retry a scheduled SQL instance.

      Delay Task

      The number of seconds by which each instance is delayed from the scheduled point in time. Valid values: 0 to 120. Unit: seconds.

      This parameter helps ensure data integrity if data is written to the source Metricstore with a delay.
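As a rough sketch of the SQL Code parameter from the Compute Settings step, a Metricstore-to-Metricstore job typically wraps a PromQL expression in the promql_query function, as recommended above. The metric name nginx_ingress_requests and the label status below are hypothetical; replace them with metrics and labels from your source Metricstore.

```sql
-- Sketch only: the metric name and label are hypothetical.
-- promql_query evaluates the expression at a single point in time, unlike
-- promql_query_range, which returns many points and can inflate the output.
* | SELECT promql_query('sum(rate(nginx_ingress_requests[1m])) by (status)') FROM metrics
```

If you set Hash Column to status in this example, results that share the same status value would be written to the same shard of the destination Metricstore.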

Sample SDKs

Use Log Service SDK for Java to create a Scheduled SQL task