After you enable the log analysis feature in the Cloud Firewall console, you can perform log-related operations. For example, you can query and analyze collected logs in real time, view or edit dashboards, and configure alert rules for monitoring. This topic describes how to query and analyze logs.

Procedure

  1. Log on to the Cloud Firewall console. In the left-side navigation pane, choose Log Analysis > Log Analysis.
  2. On the Log Analysis page, click the Logs tab and click Log Delivery.
  3. In the Log Delivery tooltip, turn on the required switches.
  4. On the Logs tab, enter a query statement in the search box.

    Log Service provides a rich query and analysis syntax to meet the requirements of log query and analysis in complex scenarios. For more information, see Use custom statements to query and analyze logs.

  5. Click 15 Minutes (Relative) to specify a time range for the query.
    You can select a relative time or a time frame, or specify a custom time range. A time range that you specify by using this method is accurate to the minute. If you need a time range that is accurate to the second, specify it in the analytic statement. Example: * | SELECT * FROM log WHERE __time__ > 1558013658 AND __time__ < 1558013660.
    Note The query and analysis results may contain logs that are generated 1 minute earlier or later than the specified time range.
  6. Click Search & Analyze to view the query and analysis results.

Manage query and analysis results

You can view the query and analysis results in a log distribution histogram, on the Raw Logs tab, or by using a chart. You can also configure alerts and saved searches.
Note By default, only 100 rows of data are returned after you execute a query statement. You can use a LIMIT clause to change the number of returned rows. For more information, see LIMIT clause.
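For example, the following statement is a sketch that uses a LIMIT clause to raise the number of returned rows to 500:

```
* | SELECT * FROM log LIMIT 500
```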
  • Log distribution histogram
    The log distribution histogram displays the distribution of query and analysis results in different time ranges.
    • If you move the pointer over a green rectangle, you can view the time range that is represented by the rectangle and the number of logs that are obtained within the time range.
    • If you click the green rectangle, you can view a more fine-grained log distribution. You can also view the query and analysis results on the Raw Logs tab.
  • Raw Logs tab
    On the Raw Logs tab, you can view the logs that match your search conditions. You can click Table or Raw Data to view the logs and perform the following operations:
    • Quick analysis: You can use this feature to analyze the distribution of a specific field within a specific period of time. For more information, see Quick analysis.
      You can click the Alias icon to specify whether to show the names or aliases of fields. You can specify aliases when you configure indexes. For example, if you set the alias of the host_name field to host, host is displayed in the Quick Analysis list after you select Show Field Aliases.
      Note If you do not specify an alias for a field, the field name is displayed after you select Show Field Aliases.
    • Contextual query: On the Raw Data tab, you can find a log and click the contextual query icon to view the context of the log in the raw log file. For more information, see Context query.
      Note The contextual query feature supports only the log data that is collected by Logtail.
    • LiveTail: On the Raw Data tab, you can find a log and click the LiveTail icon to monitor logs in real time and extract important information from the logs. For more information, see LiveTail.
      Note LiveTail can monitor and extract only the log data that is collected by Logtail.
    • Tag configurations: On the Raw Data tab, you can click the Settings icon and select Tag Configurations to hide fields that are less important.
    • Column settings: On the Table tab, you can click the Settings icon and select Column Settings to specify the columns that you want to display in the table. The column names are field names, and the column content consists of the field values.
    • JSON configurations: On the Table or Raw Data tab, you can click the Settings icon and select JSON Configurations to specify the level for JSON expansion.
    • Event settings: On the Table or Raw Data tab, you can click the Settings icon and select Event Settings to configure events for raw logs. For more information, see Configure events.
    • Log download: On the Table or Raw Data tab, you can click the Download Log icon to download logs. You can specify the method that is used to download logs and the range of logs to download. For more information, see Download logs.
  • Graph tab
    After you execute a query statement, you can view the query and analysis results on the Graph tab.
    • View query and analysis results: Log Service renders the results of query statements into charts. Log Service provides multiple chart types, such as tables, line charts, and bar charts. Log Service provides the following versions of charts: Pro and Standard. For more information, see Overview of charts (Pro) and Chart overview.
    • Log Service allows you to create dashboards to perform real-time data analysis. You can click Add to New Dashboard to save query and analysis results as charts to a dashboard. For more information, see Overview of visualization.
    • Configure drill-down events: Drill-down events are important for data analysis. You can use them to switch between the levels of data dimensions and the analysis granularities to obtain more detailed information. For more information, see Drill-down events.
  • LogReduce tab

    On the LogReduce tab, you can click Enable LogReduce to cluster similar logs. For more information, see LogReduce.

  • Alerts

    On the Search & Analysis page, you can choose Save as Alert > New Alert to configure an alert rule based on the query and analysis results. For more information, see Configure an alert monitoring rule in Log Service.

  • Saved searches

    On the Search & Analysis page, you can click Save Search to save a query statement as a saved search. For more information, see Saved search.

Use custom statements to query and analyze logs

A query statement consists of a search statement and an analytic statement that are separated by a vertical bar (|).

$Search | $Analytics
Statement Description
Search statement A search statement uses syntax that is specific to log analysis. A search statement is used to query the logs that meet specified search conditions. Search conditions include keywords, fuzzy strings, numeric values, ranges, or combinations of these items. If a search statement is empty or contains only an asterisk (*), all logs are queried.
Analytic statement An analytic statement uses the SQL-92 syntax. An analytic statement is used to analyze search results. If an analytic statement is empty, only search results are returned. No analysis is performed on the results.
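For example, a complete query statement might look like the following sketch. src_ip is a Cloud Firewall log field, and the value 192.0.2.0 is a sample address: the search statement before the vertical bar (|) filters the logs, and the analytic statement after it counts the matches.

```
src_ip: 192.0.2.0 | SELECT COUNT(*) AS request_count
```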

Search syntax

The search syntax of Log Service supports full-text search and field-specific search. The search box supports features such as multi-line search and syntax highlighting.
  • Full-text search

    You can enter a keyword to query logs without specifying a field. To query logs that contain a phrase, enclose the phrase in a pair of double quotation marks (""). If you enter multiple keywords and separate them with spaces or by using and, the logs that contain all of the keywords are queried.

    Examples
    • Query logs based on multiple keywords

      You can query the logs that contain both www.aliyun.com and error.

      www.aliyun.com error

      This statement is equivalent to www.aliyun.com and error.

    • Query logs based on a condition

      You can query the logs that contain www.aliyun.com and contain error or 404.

      www.aliyun.com and (error or 404)
    • Query logs based on a prefix

      You can query the logs that contain www.aliyun.com and start with failed_.

      www.aliyun.com and failed_*
      Note An asterisk (*) can be added only as a suffix. An asterisk (*) cannot be added as a prefix. For example, *_error is not supported.
  • Field-specific search

    You can query logs based on fields.

    You can specify a numeric field in the Field name: Value or Field name >= Value format. In this case, a comparison is performed to query logs. You can also use the and and or operators to combine field conditions. In addition, you can use field-specific search together with full-text search.
    Note The log analysis feature of Cloud Firewall allows you to perform field-specific search to obtain logs. For more information about the definition, type, and format of each field, see Log fields.
    Examples
    • Query logs based on multiple fields

      You can use the following condition to query the logs of access requests from the source IP address 192.0.2.0 to the destination IP address 192.0.2.54:

      src_ip: 192.0.2.0 and dst_ip: 192.0.2.54
      Note In the example, src_ip and dst_ip are log fields recorded by Cloud Firewall.
    • Query logs based on field existence
      • You can query the logs that contain the total_pps field.
        total_pps: *
      • You can query the logs that do not contain the total_pps field.
        not total_pps: *
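Field-specific search can also be combined with full-text search and operators in a single statement. The following sketch reuses the sample fields from the preceding examples: it queries the logs that contain the keyword error, originate from the source IP address 192.0.2.0, and do not contain the total_pps field.

```
error and src_ip: 192.0.2.0 and not total_pps: *
```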
For more information about the query syntax supported by Log Service, see Log search overview.

Analysis syntax

You can execute SQL-92 statements to analyze logs.

For more information about the analysis syntax and functions supported by Log Service, see Log analysis overview.
Note
  • You can omit the from Table name clause in standard SQL statements. This clause is equivalent to from log.
  • By default, the first 100 logs are returned. If you want to adjust this number, you can use a LIMIT clause. For more information, see LIMIT clause.
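For example, the following analytic statement is a sketch that omits the from log clause and uses a LIMIT clause to return up to 200 rows. src_ip is a Cloud Firewall log field.

```
* | SELECT src_ip, COUNT(*) AS request_count GROUP BY src_ip ORDER BY request_count DESC LIMIT 200
```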

Examples of log query and analysis

Log query and analysis based on log time

Each log records a time field. The field indicates the time when the log was generated and is in the YYYY-MM-DDThh:mm:ss+Time zone format. Example: 2018-05-31T20:11:58+08:00, where the time zone is UTC+8.

Each log contains a built-in field __time__. The field also indicates the time when the log was generated. The time is a UNIX timestamp and is used in time-specific calculation. The value of the field is the number of seconds that have elapsed since the epoch time January 1, 1970, 00:00:00 UTC. If you want to display recognizable calculation results, you must convert the time format first. For more information about time parsing functions, see Date and time functions. For example, you can use the date_parse and date_format functions to convert one time format to another.
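The conversion that date_parse and date_format perform in an analytic statement can be illustrated with a short Python sketch. The __time__ value below is chosen to match the sample time 2018-05-31T20:11:58+08:00 from this topic.

```python
from datetime import datetime, timezone, timedelta

# Sample __time__ value (UNIX seconds since 1970-01-01T00:00:00 UTC).
unix_ts = 1527768718

# Render the timestamp in the log time format for the UTC+8 time zone,
# similar to what date_format(__time__, ...) does in an analytic statement.
tz = timezone(timedelta(hours=8))
readable = datetime.fromtimestamp(unix_ts, tz).strftime("%Y-%m-%dT%H:%M:%S%z")
print(readable)  # 2018-05-31T20:11:58+0800

# The reverse direction: parse a readable time back into a UNIX timestamp,
# similar to what date_parse does in an analytic statement.
parsed = datetime.strptime("2018-05-31T20:11:58+0800", "%Y-%m-%dT%H:%M:%S%z")
print(int(parsed.timestamp()))  # 1527768718
```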