Simple Log Service:Create indexes to query and analyze logs

Last Updated:Apr 19, 2025

Simple Log Service provides log query and analysis features that are based on the SQL syntax. After you create indexes, you can query and analyze logs. This topic describes the basic syntax and limits of search and analytic statements, and the SQL functions that you can call when you query and analyze logs.

Reading guide

Query and analysis

You can query billions to hundreds of billions of logs within seconds and use the SQL syntax to perform statistical analysis on query results. A search statement can be independently executed. An analytic statement must be executed together with a search statement. The log analysis feature is used to analyze data in the search results or all data in a Logstore.

Basic syntax

Each query statement consists of a search statement and an analytic statement, which are separated with a vertical bar (|).

Search statement|Analytic statement

Search statement

  • A search statement specifies one or more search conditions and returns the logs that meet the conditions. Example: status: 200.

  • A search statement can be a keyword, a numeric value, a numeric value range, a space, or an asterisk (*). If you specify a space or an asterisk (*) as the search statement, no conditions are used for searching and all logs are returned. For more information, see Search syntax and functions.

Important

We recommend that you specify up to 30 search conditions in a search statement.
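
For example, the following search statements are sketches that use hypothetical field names such as status, request_method, and request_time; replace them with fields that are indexed in your Logstore:

status: 200 and request_method: GET

request_time > 100

The first statement returns logs in which status is 200 and request_method is GET. The second returns logs whose request_time value is greater than 100, which requires the field to be indexed as a numeric type.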

Analytic statement

  • If you want to analyze logs, you must collect the logs to a Standard Logstore and turn on Enable Analytics for the required fields when you create indexes.

  • An analytic statement is used to aggregate or analyze data in the search results or all data in a Logstore. For more information about the functions and syntax supported by Simple Log Service for analyzing logs, see the following topics:

    • SQL functions: In most cases, SQL functions are used to calculate, convert, and format data. For example, you can use SQL functions to calculate the sum and average of values, perform operations on strings, and process dates.

    • SQL clauses: SQL clauses are used to create complete SQL search statements or data processing statements to identify the sources, conditions, groups, and orders of data.

    • Subqueries: A subquery is a query in which a SELECT statement is nested inside another SELECT statement. Subqueries are used in complex analysis scenarios.

    • Join query and analysis operations on a Logstore and a MySQL database: You can use the JOIN syntax to query data from a Logstore and a MySQL database. The query results are saved to the database.

    • Use SPL to query and analyze logs: You can use Simple Log Service Processing Language (SPL) to extract structured data, process fields, and filter data in logs.

Important
  • You do not need to specify the FROM or WHERE clause in an analytic statement. By default, all data of the current Logstore is analyzed.

  • Analytic statements do not support offsets and are not case-sensitive. You do not need to append a semicolon (;) to an analytic statement.
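
For example, the following analytic statements are sketches that assume hypothetical fields such as status and request_time, indexed with Enable Analytics turned on:

* | SELECT status, avg(request_time) AS avg_time GROUP BY status

* | SELECT max(pv) FROM (SELECT count(*) AS pv, status FROM log GROUP BY status)

The first statement calls the avg SQL function and uses a GROUP BY clause to calculate the average request time for each status code. The second nests a SELECT statement inside another SELECT statement as a subquery; in subqueries, the current Logstore is referenced as the log table.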

Example

* | SELECT status, count(*) AS PV GROUP BY status

The following figure shows the query and analysis results.


Advanced features

  • LiveTail: monitor online logs in real time to reduce O&M workloads.

  • LogReduce: extract common patterns from similar logs during collection so that you can efficiently understand the logs.

  • Contextual query: view the context of a specific log to facilitate troubleshooting and issue identification.

  • Field analysis: view field distributions, statistical metrics, and top 5 time series charts to help you understand your data.

  • Event settings: obtain detailed information about raw logs based on the events that you configure.

  • Storeviews: perform cross-region and cross-Logstore JOIN queries by using Storeviews.

Limits on the query feature

Number of keywords: the maximum number of keywords that can be used as search conditions, excluding logical operators. You can specify up to 30 keywords in a search statement.

Size of a field value: the maximum size of a field value is 512 KB. The excess part is not involved in searching. If a field value exceeds 512 KB, the log may not be found by keyword search, but the log is still stored in the Logstore.

Maximum number of concurrent search statements: each project supports up to 100 concurrent search statements. For example, 100 users can concurrently execute search statements across all Logstores of a project.

Returned results: returned logs are displayed on multiple pages. Each page displays up to 100 logs.

Fuzzy search: in a fuzzy search, Simple Log Service matches up to 100 words that meet the specified conditions and returns the logs that contain one or more of these words. For more information, see Fuzzy search.
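
For example, the following fuzzy search uses a hypothetical word prefix:

fail*

Simple Log Service matches up to 100 words that start with fail, such as failed or failure, and returns the logs that contain one or more of these words.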

Data sorting in search results: by default, search results are displayed in descending order by time, accurate to the second. If logs carry nanosecond-precision timestamps, the results are sorted in descending order by time, accurate to the nanosecond.
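
If you need a different order for analysis results, you can sort explicitly in an analytic statement. The following sketch assumes a hypothetical status field indexed with Enable Analytics turned on and uses the built-in __time__ field:

* | SELECT __time__, status ORDER BY __time__ ASC LIMIT 10

This statement returns the 10 earliest logs in the query time range in ascending order of time.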

Limits of the analysis feature

Concurrency:

  • Standard instance: up to 15 concurrent queries per project.

  • Dedicated SQL instance (SQL enhancement): up to 100 concurrent queries per project.

  • Dedicated SQL instance (Complete accuracy): up to 5 concurrent queries per project.

Data volume:

  • Standard instance: a single query can scan up to 400 MB of log data (excluding cached data). Data that exceeds this limit is truncated and the query results are marked as incomplete.

  • Dedicated SQL instance (SQL enhancement): a single query can scan up to 2 GB of log data (excluding cached data). Data that exceeds this limit is truncated and the query results are marked as incomplete.

  • Dedicated SQL instance (Complete accuracy): unlimited.

Method to enable:

  • Standard instance: the log analysis feature is enabled by default.

  • Dedicated SQL instance (SQL enhancement and Complete accuracy): you must manually turn on the Dedicated SQL switch.

Fee:

  • Standard instance: free of charge.

  • Dedicated SQL instance (SQL enhancement and Complete accuracy): you are charged based on the actual CPU time.

Data effectiveness mechanism: for all instance types, you can analyze only the data that is written to Simple Log Service after the log analysis feature is enabled. To analyze historical data, you must reindex the historical data.

Return results: for all instance types, an analysis returns up to 100 rows and up to 100 MB of data by default. If the returned data exceeds 100 MB, an error is reported. To return more rows, use the LIMIT clause.
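
For example, based on the sample query earlier in this topic, the following statement raises the number of returned rows by using the LIMIT clause:

* | SELECT status, count(*) AS PV GROUP BY status LIMIT 1000

Without the LIMIT clause, only the first 100 rows of the aggregation results are returned by default.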

Maximum field length: for all instance types, the maximum value is 16,384 bytes (16 KB). Data that exceeds this limit is not analyzed.

Note: the default value is 2,048 bytes (2 KB). To change this limit, adjust the Maximum Field Length parameter. The change applies only to new data. For more information, see Create indexes.

Timeout period: for all instance types, the maximum timeout period for an analysis operation is 55 seconds.

Number of bits for double-type field values: for all instance types, double-type field values are limited to 52 significant bits. Values that require more than 52 bits lose precision as floating-point numbers.
