The Full log page of the Anti-DDoS Pro console embeds the Log analysis and Log report pages of Log Service. After you enable the DDoS log collection feature for a specific website, you can query and analyze the collected log data in real time on this page, view or edit the dashboard, and set monitoring alarms.

Procedure

  1. Log on to the Anti-DDoS Pro console, and select Log > Full log in the left-side navigation pane.
  2. Select the website for which you want to enable DDoS log collection, and confirm that the Status switch is turned on.
  3. Click Log analysis.

    The Query and analysis page of Log Service is embedded in the current page, and the system automatically enters a query statement based on the selected website, such as matched_host: www.aliyun.com, so that you can view the log data of that website.

    Figure 1. Log analysis


  4. Enter a query and analysis statement, select a log time range, and then click Query.
    Note The default storage time of DDoS logs is three days. After three days, the log data is deleted. By default, you can only query log data for the past three days. To modify the log storage time, see Modify log storage time.
    Figure 2. Log query 


On the Query and Analysis page, you can also perform the following operations.
  • Custom query and analysis

    Log Service provides different query and analysis syntaxes to support log queries in various complex scenarios. For more information, see Custom query and analysis.

  • View the log time distribution
    Below the search box, the time distribution of the logs that match the query time range and the query statement is displayed as a histogram, where the horizontal axis indicates time and the vertical axis indicates the number of logs. The total number of matched logs is also displayed.
    Note You can drag on the histogram to select a narrower time range. The time picker automatically updates to the selected range and the query results are refreshed.
    Figure 3. Log time distribution


  • View the raw logs

    On the Raw Logs tab, the details of each log entry, including its time and the content of its fields, are displayed by page. You can sort the columns, download the current query results, or click the gear icon to select the specific fields to display.

    Click a field value, or part of one, on the page to automatically add the corresponding search condition to the search box. For example, if you click the value GET in request_method: GET, the following statement is automatically added to the search box:

    [Original search statement] and request_method: GET
    Figure 4. Raw logs


  • View analysis charts

    Log Service supports graphical presentation of analysis results. You can select different chart types on the Statistics Chart page. For more information, see Analysis charts.

    Figure 5. Statistic chart


  • Quick analysis

    The quick analysis feature provides a one-click interactive query experience that helps you quickly analyze the distribution of a field over a period of time and reduce the time spent indexing key data. For more information, see Quick analysis.

    Figure 6. Quick analysis


Custom query and analysis

A log query statement consists of two parts: the query (Search) and the analysis (Analytics), separated by a vertical bar (|):

$Search | $Analytics
Type | Description
Query (Search) | The query condition, which can consist of keywords, fuzzy matches, numeric values, ranges, and combinations of conditions. If this part is empty or an asterisk (*), all data is returned.
Analysis (Analytics) | Calculates and collects statistics on the query results or on the full data.
Note Both the Search and Analytics parts are optional. If Search is empty, all data within the specified time range is returned without filtering and statistics are collected on it directly. If Analytics is empty, the query results are returned and no statistics are collected.
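
For example, the following statement is a minimal sketch that combines the two parts, using the DDoS log fields matched_host and status described later in this topic. The Search part filters the access logs of www.aliyun.com, and the Analytics part counts the number of logs for each HTTP status code:

matched_host: www.aliyun.com 
| select status, count(1) as PV 
      group by status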

Query syntax

The query syntax of Log Service supports full-text queries and field queries. The query box supports line breaks, syntax highlighting, and other features.
  • Full-text query

    You do not need to specify a field; you can enter keywords directly for the query. A keyword can be enclosed in double quotation marks (""), and multiple keywords are separated by spaces or by and.

    Example
    • Multiple keywords query 

      Search for logs containing www.aliyun.com and error. For example: 

      www.aliyun.com error

      or

      www.aliyun.com and error
    • Conditional query

      Search for logs that contain www.aliyun.com and also contain error or 404. For example: 

      www.aliyun.com and (error or 404)
    • Prefix query

      Search for logs that contain www.aliyun.com and keywords that start with failed_. For example: 

      www.aliyun.com and failed_*
       Note The query supports only a trailing wildcard (*), such as failed_*. A leading wildcard, such as *_error, is not supported.
  • Field query

    Log Service supports more accurate queries based on fields. 

    Field queries use the format field:value, and numeric fields also support comparisons such as field>=value. Conditions can be combined with and and or, and can also be combined with full-text queries.

    The DDoS website access logs and attack logs also support field-based queries. For the meaning, type, and format of each field, see DDoS log fields.

    Example
    • Multiple fields query

      Search for logs of CC attacks on the website www.aliyun.com:

      matched_host: www.aliyun.com and cc_blocks: 1

      Search for access logs in which the client 1.2.3.4 accesses the website www.aliyun.com and receives a 404 error:

      real_client_ip: 1.2.3.4 and matched_host: www.aliyun.com and status: 404
       Note The fields used in these examples, matched_host, cc_blocks, real_client_ip, and status, are fields of the DDoS access and attack logs. For more information, see DDoS log fields.
    • Numeric field query

      Search for all slow request logs with a response time of more than 5 seconds:

      request_time_msec > 5000

      Interval queries are also supported. For example, query logs with a response time greater than 5 seconds and less than or equal to 10 seconds: 

      request_time_msec in (5000 10000]

      The query can also be performed by the following statement:

      request_time_msec > 5000 and request_time_msec <= 10000
    • Check whether a field exists

      Query for the presence of specific fields:

      • Query logs in which the ua_browser field exists: ua_browser: *
      • Query logs in which the ua_browser field does not exist: not ua_browser: *
For more information about query syntax, see Index and query.

Analysis syntax

You can use the SQL/92 syntax for log data analysis and statistics. For more information about the syntax and functions supported by Log Service, see Syntax description.
Note
  • The from <table name> clause in standard SQL syntax, that is, from log, can be omitted from the analysis statement.
  • By default, the first 100 log entries are returned. You can change the number of returned entries by using the LIMIT clause, as shown in the example below.
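
The following statement is a minimal sketch that illustrates both points, using the DDoS log fields real_client_ip and status described in this topic. The from clause is omitted, and LIMIT raises the number of returned entries from the default 100 to 500:

matched_host: www.aliyun.com 
| select real_client_ip, status 
      limit 500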

Time-based log query analysis

Each DDoS log has a time field in the format year-month-dayThour:minute:second+time zone, for example, 2018-05-31T20:11:58+08:00, where the time zone is UTC+8, that is, Beijing time. In addition, each log has a built-in field, __time__, which also indicates the time of the log so that time-based calculations can be performed during statistics. Its format is a Unix timestamp, that is, the number of seconds elapsed since 1970-01-01 00:00:00 UTC. Therefore, a calculated time must be formatted before it can be displayed.
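For example, the following minimal sketch formats the built-in __time__ field with from_unixtime so that it can be displayed as a readable time; the other fields are DDoS log fields used only for illustration:

* 
| select from_unixtime(__time__) as log_time, real_client_ip, http_user_agent 
      order by __time__ desc 
      limit 10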
  • Select and show time
    Within a selected time range, query the latest 10 logs of CC attacks on the website www.aliyun.com and display the time, source IP address, and client, using the time field directly:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | select time, real_client_ip, http_user_agent
        order by time desc
        limit 10
  • Calculation time
    To query the number of days that have passed since each CC attack, use __time__ in the calculation: 
    matched_host: www.aliyun.com and cc_blocks: 1 
    | select time, 
              round((to_unixtime(now()) - __time__)/86400, 1) as "days_passed", real_client_ip, http_user_agent
          order by time desc
          limit 10
    Note In round((to_unixtime(now()) - __time__)/86400, 1), to_unixtime converts the time returned by now() into a Unix timestamp, from which the built-in time field __time__ is subtracted to obtain the number of seconds that have passed. The result is divided by 86400, the number of seconds in a day, and round(data, 1) rounds it to one decimal place. The resulting value indicates how many days have passed since each attack log was generated.
  • Group statistics based on specific time
    To know how many CC attacks a website suffers each day within a specific time range, use the following SQL statement:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | select date_trunc('day', __time__) as dt, 
             count(1) as PV 
          group by dt 
    	  order by dt
    Note This example uses the built-in time field __time__ with the function date_trunc('day', ..) to align times to the day. Each log is grouped into the partition of the day it belongs to, the total number in each partition is counted with count(1), and the results are sorted by partition time. The first argument of date_trunc supports other alignment units, including second, minute, hour, week, month, and year. For more information about this function, see Date and time functions.
  • Time-based group statistics
    For more flexible time-based grouping, for example, to obtain the trend of CC attacks on a website at 5-minute intervals, mathematical calculation is required. Run the following SQL statement:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | select from_unixtime(__time__ - __time__% 300) as dt, 
             count(1) as PV 
          group by dt 
    	  order by dt 
    	  limit 1000
    Note The built-in time field is aligned to 5-minute boundaries with the calculation __time__ - __time__ % 300 and formatted with the function from_unixtime. Each log is grouped into its 5-minute (300-second) partition, the total number in each partition is counted with count(1), and the partitions are sorted by time. The first 1,000 results are returned, which covers about the first 83 hours (1,000 x 5 minutes) of the selected time range.
For more time-parsing functions, such as converting time formats, use date_parse and date_format. For more information, see Date and time functions.
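
For example, the following sketch builds on the CC attack query above and uses date_trunc and date_format together to count attacks per hour and display each hour in a readable format:

matched_host: www.aliyun.com and cc_blocks: 1 
| select date_format(date_trunc('hour', __time__), '%Y-%m-%d %H:00') as hour, 
         count(1) as PV 
      group by hour 
      order by hour 
      limit 1000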

Client IP-based query analysis 

DDoS logs record the real IP address of the client in the real_client_ip field. However, when the real IP address cannot be obtained, for example, because the user accesses the website through a proxy or the IP address in the request headers is incorrect, you can use the remote_addr field, which records the IP address of the client that directly connects to Anti-DDoS Pro. 
  • Country attack distribution 
    Distribution of source countries of CC attacks on a website:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | SELECT ip_to_country(if(real_client_ip='-', remote_addr, real_client_ip)) as country, 
             count(1) as "number of attacks" 
    		 group by country
    Note The function if(condition, option1, option2) selects either real_client_ip or remote_addr as the source IP address: remote_addr is used when real_client_ip is -. The obtained IP address is then passed to the function ip_to_country to get the country it belongs to.
  • Access distribution
    To get more detailed province-based distribution, use the ip_to_province function, for example:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | SELECT ip_to_province(if(real_client_ip='-', remote_addr, real_client_ip)) as province, 
             count (1) as "number of attacks" 
    		 group by province
    Note The function ip_to_province returns the province to which an IP address belongs. If the IP address is outside China, the system still attempts to return the corresponding province or state.
  • Attacker heat map distribution
    To get a heat map of attackers, use the ip_to_geo function, for example:
    matched_host: www.aliyun.com and cc_blocks: 1 
    | SELECT ip_to_geo(if(real_client_ip='-', remote_addr, real_client_ip)) as geo, 
             count (1) as "number of attacks" 
    		 group by geo
    		 limit 10000
    Note The function ip_to_geo returns the latitude and longitude of an IP address. The limit clause returns at most the first 10,000 results.
For more IP address parsing functions, such as ip_to_provider, which returns the network provider of an IP address, and ip_to_domain, which determines whether an IP address is public or private, see IP functions.
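
For example, the following sketch builds on the CC attack query above and uses ip_to_provider to obtain the top 10 network providers from which attacks originate:

matched_host: www.aliyun.com and cc_blocks: 1 
| SELECT ip_to_provider(if(real_client_ip='-', remote_addr, real_client_ip)) as provider, 
         count(1) as "number of attacks" 
      group by provider 
      order by "number of attacks" desc 
      limit 10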