After you enable the log analysis feature in the Cloud Firewall console, you can perform log-related operations. For example, you can query and analyze collected logs in real time, view or edit dashboards, and configure alert rules for monitoring. This topic describes how to query and analyze logs.
Procedure
Manage query and analysis results
- Log distribution histogram
The log distribution histogram displays the distribution of query and analysis results in different time ranges.
- If you move the pointer over a green rectangle, you can view the time range that is represented by the rectangle and the number of logs that are obtained within the time range.
- If you click a green rectangle, you can view a more fine-grained log distribution. You can also view the query and analysis results on the Raw Logs tab.
- Raw Logs tab
On the Raw Logs tab, you can view the logs that match your search conditions. You can click Table or Raw Data to view the logs and perform the following operations:
- Quick analysis: You can use this feature to analyze the distribution of a specific field within a specific period of time. For more information, see Quick analysis.
  You can click the icon to specify whether to show the names or aliases of fields. You can specify aliases when you configure indexes. For example, if you set the alias of the host_name field to host, host is displayed in the Quick Analysis list after you select Show Field Aliases.
  Note If you do not specify an alias for a field, the field name is displayed after you select Show Field Aliases.
- Contextual query: On the Raw Data tab, you can find a log and click the icon to view the context of the log in the raw log file. For more information, see Context query.
  Note The contextual query feature supports only the log data that is collected by Logtail.
- LiveTail: On the Raw Data tab, you can find a log and click the icon to monitor logs in real time and extract important information from the logs. For more information, see LiveTail.
  Note LiveTail can monitor and extract only the log data that is collected by Logtail.
- Tag configurations: On the Raw Data tab, you can click the icon and select Tag Configurations to hide fields that are less important.
- Column settings: On the Table tab, you can click the icon and select Column Settings to specify the columns that you want to display in the table. The column names are field names, and the column content is used as field values.
- JSON configurations: On the Table or Raw Data tab, you can click the icon and select JSON Configurations to specify the level for JSON expansion.
- Event settings: On the Table or Raw Data tab, you can click the icon and select Event Settings to configure events for raw logs. For more information, see Configure events.
- Log download: On the Table or Raw Data tab, you can click the icon to download logs. You can specify the method that is used to download logs and the range of logs to download. For more information, see Download logs.
- Graph tab
After you execute a query statement, you can view the query and analysis results on the Graph tab.
- View query and analysis results: Log Service renders the results of query statements into charts. Log Service provides multiple chart types, such as tables, line charts, and bar charts. Log Service provides the following versions of charts: Pro and Standard. For more information, see Overview of charts (Pro) and Chart overview.
- Log Service allows you to create dashboards to perform real-time data analysis. You can click Add to New Dashboard to save query and analysis results as charts to a dashboard. For more information, see Overview of visualization.
- Configure interaction occurrences: Interaction occurrences are important for data analysis. You can use interaction occurrences to switch between the levels of data dimensions and the analysis granularities to obtain more detailed information. For more information, see Drill-down events.
- LogReduce tab
On the LogReduce tab, you can click Enable LogReduce to cluster similar logs. For more information, see LogReduce.
- Alerts
On the Search & Analysis page, you can configure an alert rule based on the query and analysis results. For more information, see Configure an alert monitoring rule in Log Service.
- Saved searches
On the Search & Analysis page, you can click Save Search to save a query statement as a saved search. For more information, see Saved search.
Use custom statements to query and analyze logs
A query statement consists of a search statement and an analytic statement that are separated by a vertical bar (|).
`$Search | $Analytics`
| Statement | Description |
| --- | --- |
| Search statement | A search statement uses syntax that is specific to log analysis. A search statement is used to query the logs that meet specified search conditions. Search conditions include keywords, fuzzy strings, numeric values, ranges, and combinations of these items. If a search statement is empty or contains only an asterisk (`*`), all logs are queried. |
| Analytic statement | An analytic statement uses the SQL-92 syntax. An analytic statement is used to analyze search results. If an analytic statement is empty, only search results are returned, and no analysis is performed on the results. |
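For example, the following statement is a minimal sketch that uses the `src_ip` and `dst_ip` Cloud Firewall log fields described later in this topic. The search part before the vertical bar matches the logs whose `src_ip` field is `192.0.2.0`, and the analytic part after the bar counts the matched logs for each destination IP address. The `log_count` alias is illustrative.

```
src_ip: 192.0.2.0 | SELECT dst_ip, count(*) AS log_count GROUP BY dst_ip
```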
Search syntax
- Full-text search
You can enter a keyword to query logs. You do not need to specify exact fields. You can enclose a keyword in a pair of double quotation marks ("") to query the logs that contain the keyword. If you enter multiple keywords, you can separate them with spaces or by using `and`. This way, you can query the logs that contain the keywords.

Examples:

- Query logs based on multiple keywords

  You can use the following statements to query the logs that contain both `www.aliyun.com` and `error`:

  `www.aliyun.com error` or `www.aliyun.com and error`

- Query logs based on a condition

  You can use the following statement to query the logs that contain `www.aliyun.com` and contain `error` or `404`:

  `www.aliyun.com and (error or 404)`

- Query logs based on a prefix

  You can use the following statement to query the logs that contain `www.aliyun.com` and start with `failed_`:

  `www.aliyun.com and failed_*`

  Note An asterisk (`*`) can be added only as a suffix. An asterisk (`*`) cannot be added as a prefix. For example, `*_error` is not supported.
- Field-specific search
You can query logs based on fields.

You can specify a numeric field in the `Field name: Value` or `Field name >= Value` format. In this case, a comparison is performed to query logs. You can also use operators such as `and` and `or` to combine field-based conditions. In addition, you can use field-specific search together with full-text search, as shown in the sketch after the following examples.

Note The log analysis feature of Cloud Firewall allows you to perform field-specific searches to obtain logs. For more information about the definition, type, and format of each field, see Log fields.

Examples:

- Query logs based on multiple fields

  You can use the following condition to query the logs on access requests from the client whose IP address is `192.0.2.0` to the server whose IP address is `192.0.2.54`:

  `src_ip: 192.0.2.0 and dst_ip: 192.0.2.54`

  Note In the example, `src_ip` and `dst_ip` are log fields recorded by Cloud Firewall.

- Query logs based on field existence

  - You can use the following statement to query the logs that contain the `total_pps` field:

    `total_pps: *`

  - You can use the following statement to query the logs that do not contain the `total_pps` field:

    `not total_pps: *`
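The following statement is a minimal sketch of combining field-specific search with full-text search. The `error` keyword is illustrative; the statement matches the logs that contain the keyword `error` and whose `src_ip` field is `192.0.2.0`:

```
error and src_ip: 192.0.2.0
```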
Analysis syntax
You can execute SQL-92 statements to analyze logs.
- You can omit the `from Table name` clause in standard SQL statements. This clause is equivalent to `from log`.
- By default, the first 100 logs are returned. If you want to adjust this number, you can use a LIMIT clause. For more information, see LIMIT clause. An example that shows both behaviors follows this list.
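For example, the following analytic statement is a minimal sketch that omits the `from` clause and uses a LIMIT clause to return up to 500 results instead of the default 100. The statement groups the matched logs by the `src_ip` Cloud Firewall log field, and the `request_count` alias is illustrative.

```
* | SELECT src_ip, count(*) AS request_count GROUP BY src_ip LIMIT 500
```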
Examples of log query and analysis
Log query and analysis based on log time
Each log records a `time` field. The field indicates the time when the log was generated and is in the `YYYY-MM-DDThh:mm:ss+Time zone` format. Example: `2018-05-31T20:11:58+08:00`, where the time zone is `UTC+8`.

Each log also contains a built-in `__time__` field. The field also indicates the time when the log was generated. The time is a UNIX timestamp and is used in time-specific calculations. The value of the field is the number of seconds that have elapsed since the epoch time January 1, 1970, 00:00:00 UTC. If you want to display recognizable calculation results, you must convert the time format first. For more information about time parsing functions, see Date and time functions. For example, you can use the `date_parse` and `date_format` functions to convert one time format to another.
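As an illustration, the following statement is a minimal sketch that converts the `__time__` UNIX timestamp into a readable format and counts logs per minute. It assumes the `from_unixtime` function from the date and time functions that Log Service provides, and the `log_time` and `pv` aliases are illustrative.

```
* | SELECT date_format(from_unixtime(__time__), '%Y-%m-%d %H:%i') AS log_time, count(*) AS pv GROUP BY log_time ORDER BY log_time LIMIT 100
```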