This feature maintains a whitelist of authorized search engine crawlers, such as Google, Bing, Baidu, Sogou, 360, and Yandex, so that requests from these crawlers can be directly forwarded to the target domain.

Notice: This topic applies to the new version of the Web Application Firewall (WAF) console, which was released in January 2020. If your WAF instance was created before this date, you cannot use this feature.

Prerequisites

  • A Web Application Firewall (WAF) instance is purchased and deployed in a region inside mainland China, and the Bot Manager module is enabled.
  • The website is associated with the Web Application Firewall instance. For more information, see Add domain names.

Background information

Rules described in this topic allow requests from specific crawlers to the target domain based on the Alibaba Cloud crawler library. The library is updated in real time based on analysis of the network traffic that flows through Alibaba Cloud, captures the characteristics of requests initiated by crawlers, and contains the crawler IP addresses of mainstream search engines, including Google, Bing, Baidu, Sogou, 360, and Yandex.
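The crawler library automates what the major search engines recommend site owners do by hand: confirm that an IP claiming to be a crawler really belongs to that engine, using a reverse-DNS lookup followed by a forward confirmation. The sketch below illustrates this double lookup in Python; the domain suffixes are assumptions drawn from each engine's published guidance (and the list is deliberately incomplete), not the contents of the Alibaba Cloud crawler library.

```python
import socket
from typing import Optional

# Assumed reverse-DNS suffixes per engine, based on each engine's published
# guidance. Illustrative and incomplete (e.g. 360 is omitted); the actual
# WAF crawler library is maintained by Alibaba Cloud.
CRAWLER_DOMAIN_SUFFIXES = {
    "Google": (".googlebot.com", ".google.com"),
    "Bing":   (".search.msn.com",),
    "Baidu":  (".baidu.com", ".baidu.jp"),
    "Sogou":  (".sogou.com",),
    "Yandex": (".yandex.ru", ".yandex.net", ".yandex.com"),
}

def engine_for_hostname(hostname: str) -> Optional[str]:
    """Return the search engine a reverse-DNS hostname belongs to, if any."""
    host = hostname.rstrip(".").lower()
    for engine, suffixes in CRAWLER_DOMAIN_SUFFIXES.items():
        if any(host.endswith(s) for s in suffixes):
            return engine
    return None

def verify_crawler_ip(ip: str) -> Optional[str]:
    """Reverse-resolve the IP, then forward-confirm the hostname maps back.

    Returns the engine name if the IP passes both checks, else None.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)   # reverse DNS lookup
    except OSError:
        return None
    engine = engine_for_hostname(hostname)
    if engine is None:
        return None
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward confirm
    except OSError:
        return None
    return engine if ip in forward_ips else None
```

The forward confirmation matters: anyone can set a PTR record that ends in `googlebot.com`, but only Google controls the forward records that map such a hostname back to its own IP ranges.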

After you enable a rule that allows requests from specific crawlers to the target domain, requests initiated from the crawler IP addresses of the authorized search engines are directly sent to the target domains. The bot management module no longer detects these requests.
Note: You can still use other protection features, such as ACL rules and traffic throttling rules, to filter requests from whitelisted crawler IP addresses.

Procedure

  1. Log on to the Web Application Firewall console.
  2. In the top navigation bar, select the resource group to which the instance belongs and the region, Mainland China or International, in which the instance is deployed.
  3. In the left-side navigation pane, choose Protection Settings > Website Protection.
  4. In the upper part of the Website Protection page, select the domain name for which you want to configure the whitelist.
  5. Click the Bot Management tab and find the Allowed Crawlers section. Turn on the Status switch and click Settings.
  6. In the Allowed Crawlers list, find the target rule by Intelligence Name, and turn on the Status switch.
    The default rules only allow crawler requests from the following search engines: Google, Bing, Baidu, Sogou, 360, and Yandex. You can enable the Legit Crawling Bots rule to allow requests from all search engine crawlers.
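Before enabling a rule, it can help to estimate how much of your traffic these crawlers account for. A rough sketch, assuming access logs in the common combined format where the User-Agent is the last quoted field; the User-Agent tokens below are illustrative (the actual WAF rules match crawler IP addresses, not User-Agent strings):

```python
import re
from collections import Counter

# Illustrative User-Agent substrings for the whitelisted engines; these are
# assumptions for log analysis only, not what the WAF rules match against.
CRAWLER_TOKENS = {
    "Googlebot": "Google",
    "bingbot": "Bing",
    "Baiduspider": "Baidu",
    "Sogou": "Sogou",
    "360Spider": "360",
    "YandexBot": "Yandex",
}

def tally_crawlers(log_lines):
    """Count requests per search engine based on the quoted User-Agent field."""
    counts = Counter()
    for line in log_lines:
        quoted = re.findall(r'"([^"]*)"', line)  # request, referer, user-agent
        if not quoted:
            continue
        user_agent = quoted[-1]
        for token, engine in CRAWLER_TOKENS.items():
            if token in user_agent:
                counts[engine] += 1
                break
    return counts
```

Run against a day of access logs, the tally gives a rough sense of which of the default rules will see traffic; remember that a self-reported User-Agent can be spoofed, which is exactly why the whitelist is based on verified crawler IP addresses.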