The Scheduled SQL feature analyzes data at scheduled times and aggregates, projects, and filters data for storage. This topic describes how to create a Scheduled SQL job in Simple Log Service that processes data from a source Logstore and stores the results in a destination Logstore.
The Scheduled SQL feature is in public preview and is free of charge. After the public preview ends, you will be charged for the computing resources consumed by Dedicated SQL. For more information about billing, see Billable items for the pay-by-feature billing model.
Prerequisites
Prepare a source Logstore
Create a Simple Log Service project and a standard Logstore. Then, collect logs and create indexes. For more information, see Manage a project, Create a standard Logstore, Data collection overview, and Create indexes.
Prepare a destination Logstore
In the Alibaba Cloud account where you want to store the job results, create a Simple Log Service project and a standard Logstore. Then, enable indexing.
Procedure
Log on to the Simple Log Service console using an Alibaba Cloud account or a Resource Access Management (RAM) user that has the permissions to create Scheduled SQL jobs.
In the Projects section, click the project that contains the source Logstore.
In the navigation pane on the left, click Log Storage. In the Logstores list, click the name of the Logstore that you want to analyze. This Logstore is the source Logstore for the Scheduled SQL job.
Enter a query statement. Then, click Last 15 Minutes to set the time range for the query.
Note: This step lets you preview the data for the Scheduled SQL job. You can verify that your query statement is correct and that the results contain data.
On the Graph tab, click Save As Scheduled SQL.

Create the Scheduled SQL job.
In the Compute Settings wizard, configure the following parameters and click Next.
Parameter
Description
Job Name
The unique name of the Scheduled SQL job. You can keep the default name.
Display Name
Enter a display name for the Scheduled SQL job.
Job Description
(Optional) Enter a description for the Scheduled SQL job.
Resource Pool
Simple Log Service provides enhanced resource pools for data analytics.
These pools use the computing power of Dedicated SQL. They provide high concurrency and isolate resources from your SQL analysis operations in the console. You are charged for enhanced resource pools based on the CPU time that your SQL analysis operations consume. For more information, see High-performance and fully accurate query and analysis (Dedicated SQL).
Write Mode
Select Import Data From Logstore To Logstore. This means the job processes data from the source Logstore and stores the results in the destination Logstore.
SQL Code
Displays the query statement that you entered in Step 4.
You can also define a different query statement here. After you enter the statement, select a time range and click Preview to confirm the results.
When the Scheduled SQL job runs, Simple Log Service executes this query statement to analyze the data.
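For reference, a statement in the Simple Log Service search-and-analysis syntax (search condition, then a pipe, then SQL) might look like the following. The status field and the aggregation are hypothetical examples, not fields guaranteed to exist in your Logstore:

```sql
* | SELECT status, count(*) AS pv GROUP BY status
```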
Source Project/Logstore
Displays the project and Logstore of the data source. This parameter cannot be modified.
Target
Source and destination Logstores in the same account
Parameter
Description
Destination Region
Select the region where the Target Project is located.
Destination Project
Select the Destination Project from the drop-down list.
Target Store
Select the destination Logstore from the drop-down list.
Write Authorization
The Scheduled SQL job can assume a Default Role or a Custom Role to write data to the destination Logstore. Select one of the roles.
Default Role: The AliyunLogETLRole role has the permissions to run SQL analysis in the source Logstore or MetricStore and write the results to the destination Logstore or MetricStore. For more information, see Use the default role to create a Scheduled SQL job.
Custom Role: You can create a custom role and a custom policy for fine-grained permission management. For more information, see Grant a custom RAM role the permissions to write to a destination Logstore.
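As a sketch, a custom policy that grants only write access to the destination Logstore might look like the following. The project and Logstore names are placeholders; verify the exact actions and resource format against the RAM policy documentation for Simple Log Service:

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "log:PostLogStoreLogs",
      "Resource": "acs:log:*:*:project/target-project/logstore/target-logstore"
    }
  ]
}
```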
Source and destination Logstores in different accounts
Parameter
Description
Destination Region
Select Other Regions and accept the Compliance Warranty On Cross-border Data Transfer. Then, enter the public endpoint of the destination project, for example, cn-hangzhou.log.aliyuncs.com.
Destination Project
Enter the name of the destination project, for example, test-project.
Target Store
Enter the name of the destination Logstore, for example, test-logstore.
Write Authorization
Select Custom Role. For more information, see Grant a custom RAM role the permissions to write to a destination Logstore.
SQL Execution Authorization
The Scheduled SQL job can assume a Default Role or a Custom Role to query and analyze data in the source Logstore. Select one of the roles.
Default Role: The AliyunLogETLRole role has the permissions to run SQL analysis in the source Logstore or MetricStore and write the results to the destination Logstore or MetricStore. For more information, see Use the default role to create a Scheduled SQL job.
Custom Role: You can create a custom role and a custom policy for fine-grained permission management. For more information, see Grant a custom RAM role the permissions to analyze a source Logstore.
In the Scheduling Settings wizard, configure the following parameters and click OK.
Parameter
Description
Scheduling Interval
The frequency at which the job is scheduled. An instance is created for each run, and the interval determines the scheduled time of each instance.
Fixed Interval: Schedules the job at a fixed interval.
Cron: Schedules the job at an interval specified by a cron expression.
A cron expression is accurate to the minute and uses a 24-hour clock. For example, 0 0/1 * * * runs the job every hour, starting at 00:00.
To configure a time zone, you must select the Cron mode. For a list of common time zones, see Time zones.
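The hourly schedule implied by the cron expression above can be sketched with plain datetime arithmetic. This is a simplified model of "minute 0 of every hour", not the service's actual scheduler:

```python
from datetime import datetime, timedelta

def hourly_schedule(start: datetime, count: int) -> list[datetime]:
    """Enumerate the scheduled times implied by the cron expression
    "0 0/1 * * *" (minute 0 of every hour), beginning at `start`."""
    # Round up to the next full-hour boundary (minute 0).
    first = start.replace(minute=0, second=0, microsecond=0)
    if first < start:
        first += timedelta(hours=1)
    return [first + timedelta(hours=i) for i in range(count)]

runs = hourly_schedule(datetime(2021, 4, 1, 0, 0, 0), 3)
print([r.strftime("%H:%M") for r in runs])  # → ['00:00', '01:00', '02:00']
```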
Scheduling Time Range
The time range during which the job is scheduled.
From Specific Time: Specifies the start time for the first job instance.
Specific Time Range: Specifies a start and end time. The job runs only within this time range.
Note: Job instances are scheduled only within this time range. New instances are not created outside this range.
The scheduling time range is based on __time__. For more information, see Reserved fields.
SQL Time Window
The time window for the logs to be analyzed when the job runs. This time window works in conjunction with the scheduling interval. The time window cannot be more than five times the Scheduling Interval and cannot exceed one day. For more information, see Time expression syntax.
For example, if Scheduling Interval is Fixed Interval: 10 Minutes, Start At is 2021-04-01 00:00:00, Delay Task is 30 Seconds, and SQL Time Window is [@m-10m,@m), the first instance is created at 00:00:30 and analyzes logs from the [23:50:00, 00:00:00) time range. For more information, see Scheduling and execution scenarios.
Note: The SQL time window is based on __time__. For more information, see Reserved fields. If you do not define __time__ in the SQL code, the __time__ value for logs written to the destination Logstore defaults to the start time of the scheduling window.
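The example above can be sketched as follows. This is a simplified model of how a scheduled time maps to an instance's creation time and SQL time window, not the service's implementation:

```python
from datetime import datetime, timedelta

def instance_plan(scheduled: datetime, window_min: int, delay_s: int):
    """For a SQL Time Window of [@m-<window_min>m, @m), return the
    instance creation time (scheduled time + Delay Task) and the
    half-open [start, end) window of logs it analyzes."""
    creation = scheduled + timedelta(seconds=delay_s)
    window_start = scheduled - timedelta(minutes=window_min)
    return creation, (window_start, scheduled)

creation, (w0, w1) = instance_plan(datetime(2021, 4, 1, 0, 0, 0), 10, 30)
print(creation)          # 2021-04-01 00:00:30
print(w0, "→", w1)       # 2021-03-31 23:50:00 → 2021-04-01 00:00:00
```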
SQL Timeout
The maximum retry time and maximum number of retries for a failed SQL analysis operation. If the retry duration exceeds the maximum time or the number of retries exceeds the maximum count, the instance stops and its status changes to FAILED. You can manually retry a failed instance. For more information, see Retry a Scheduled SQL job instance.
Delay Task
The amount of time to delay the execution after the scheduled time. The valid values are 0 to 120 seconds.
If there is a delay when writing data to the Logstore, you can use this parameter to ensure data integrity.
After you create the Scheduled SQL job, you can view and manage it in the console. For more information, see Manage Scheduled SQL jobs.