
Edge Security Acceleration: Identify and mitigate bot traffic

Last Updated: Oct 11, 2025

ESA bot management uses metric-based detection, risk scoring, and continuous learning to identify and mitigate automated traffic.

Identify bot traffic

Identifying automated traffic is key to effective bot management. ESA bot management provides several strategies. In Smart Mode, AI makes decisions automatically. In Professional Mode, you can configure methods like blacklists, whitelists, behavior analysis, and feature analysis.

AI-based decision-making

Smart Mode combines AI models with large-scale threat intelligence data to score and grade requests by risk level. Based on these grades and scores, bot requests are classified as Verified Bots, Definite Bots, or Likely Bots.

You can set actions for each bot type:

  • Verified Bots are requests from search engine crawlers that help with your website's search engine optimization (SEO). The recommended action is Allow.

  • Definite Bots are requests from known malicious crawlers. They pose a significant threat to websites. The recommended action is Block or Slider CAPTCHA.

  • Likely Bots have a threat level between the two types above. Some of this traffic consists of benign requests, while the rest comes from malicious crawlers. The recommended action is Slider CAPTCHA, which filters out bot requests.
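The grading above can be illustrated with a toy classifier. The score scale, thresholds, and grade-to-action mapping below are invented for demonstration and are not ESA's actual model:

```python
# Hypothetical sketch of mapping a risk score to the three bot grades
# and their recommended actions; the 0-100 scale and the threshold of 80
# are assumptions, not ESA's real scoring.
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    CAPTCHA = "slider_captcha"
    BLOCK = "block"

def classify(risk_score: float, is_verified_crawler: bool) -> tuple[str, Action]:
    """Map a request's risk score (0-100) to a bot grade and an action."""
    if is_verified_crawler:
        return "Verified Bot", Action.ALLOW   # e.g. a search engine crawler
    if risk_score >= 80:                      # scored as a known malicious crawler
        return "Definite Bot", Action.BLOCK
    return "Likely Bot", Action.CAPTCHA       # ambiguous traffic: challenge it
```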

Whitelists

For harmless crawlers or safe requests, ESA supports whitelists to ensure they are not mistakenly blocked.

  • Legitimate search engine whitelist: Major search engines like Google and Bing use crawlers to index websites. This indexing is essential for your website's ranking and brings organic traffic, which is especially important for content-based or commercial websites. You can allow specific search engine crawlers to bypass bot management checks.

  • IP whitelist: You can add IP addresses to the IP whitelist to allow these requests to bypass bot protection.
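Before whitelisting a search engine crawler, it is worth verifying that a request really comes from that engine. The following sketch approximates the double DNS lookup that Google documents for verifying Googlebot; the suffix list covers Googlebot only, and a production check would also cache results:

```python
# Sketch of the reverse-then-forward DNS verification search engines
# document for their crawlers (shown here for Googlebot only).
import socket

GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_googlebot(host: str) -> bool:
    """Pure check: does the reverse-DNS hostname belong to Google?"""
    return host.endswith(GOOGLEBOT_SUFFIXES)

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve back."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)              # reverse DNS
        if not hostname_is_googlebot(host):
            return False
        return ip in socket.gethostbyname_ex(host)[2]      # forward DNS must match
    except (socket.herror, socket.gaierror):
        return False
```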

Blacklists

For identified malicious bots, ESA supports blacklists that let you mark and block requests. You can use blacklists to block traffic from unsafe or malicious IDCs (Internet data centers).

  • IDC blacklist: If your visitors do not access your website from public cloud or data center networks, you can block data center requests to reduce risks. Be sure to whitelist trusted sources, such as Alipay or WeChat payment callbacks and monitoring programs.

  • Bot threat intelligence library: Block malicious bots by using a database of IP addresses collected by Alibaba Cloud. These IPs have shown repeated harmful crawling behavior across multiple users within a set period. Requests from these IPs will be blocked.
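A minimal sketch of how blacklist evaluation might interact with the trusted-source whitelist mentioned above, with the whitelist taking precedence. All IP ranges here are documentation placeholders, not real trusted sources or data center ranges:

```python
# Illustrative evaluation order: a trusted-source whitelist (e.g. payment
# callbacks) is checked first, then the IDC/threat-intelligence blacklist.
from ipaddress import ip_address, ip_network

TRUSTED = [ip_network("198.51.100.0/26")]       # hypothetical callback range
IDC_BLACKLIST = [ip_network("192.0.2.0/24")]    # hypothetical data-center range

def should_block(client_ip: str) -> bool:
    addr = ip_address(client_ip)
    if any(addr in net for net in TRUSTED):     # whitelist wins over blacklist
        return False
    return any(addr in net for net in IDC_BLACKLIST)
```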

Identify bots through request characteristics

ESA detects unusual behavior by analyzing static properties of HTTP requests, such as headers, protocol integrity, and SSL/TLS fingerprints.

  • JavaScript challenge: Detects whether the client can execute JavaScript. Requests from non-browser tools that cannot execute JavaScript will be considered malicious and blocked.

  • Dynamic token challenge: When a user or client sends a request, ESA dynamically generates a one-time encrypted token based on the request context, such as the timestamp, user behavior, or IP address, and signs it to prevent tampering. Requests without a valid token or signature are blocked.

Identify bots through request behavior

Traffic behavior identification analyzes request patterns to distinguish between human users and automated bots. It detects abnormal behaviors such as high-frequency requests, regular intervals, and scanning activity, which are common indicators of bots.
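Two of the behavior signals mentioned, high request frequency and machine-regular intervals, can be illustrated with a toy check. The thresholds are arbitrary demonstration values:

```python
# Toy behavior analysis: flag a client whose request rate is too high
# or whose inter-request gaps are suspiciously regular (low jitter).
from statistics import pstdev

def looks_automated(timestamps: list[float],
                    max_rps: float = 5.0, min_jitter: float = 0.05) -> bool:
    """Return True if timing suggests a bot; thresholds are illustrative."""
    if len(timestamps) < 3:
        return False                                  # too little data to judge
    span = timestamps[-1] - timestamps[0]
    rate = (len(timestamps) - 1) / span if span > 0 else float("inf")
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return rate > max_rps or pstdev(gaps) < min_jitter  # too fast, or too regular
```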

Handle bot traffic

Basic protection policies

  • Allow: Allow recognized legitimate bot traffic, such as search engine crawlers and partner API calls.

  • Block: Block access from malicious bots, such as DDoS attack tools and vulnerability scanners. We recommend blocking Definite Bots.

  • Monitor: Monitor and log suspicious but unverified bots.

  • Slider CAPTCHA: Distinguish human users from automated bots by requiring a simple sliding or clicking action.

Custom throttling

To allow certain bots but limit how often they access your site, you can control request frequency by IP address or session. If requests exceed your set threshold, protection actions will be applied. Configure these settings in Custom Throttling.

Note

Custom Throttling is only available in Professional Mode.

Configuration example 1

  • Requirement: Limit requests from the same IP address to a maximum of 10 within one minute; if this limit is exceeded, block the IP for one hour. Alternatively, allow up to 100 requests from the same IP within 30 minutes; if this threshold is exceeded, apply a slider CAPTCHA challenge for 10 minutes.

  • Configuration: Add the following two conditions (each condition is connected by OR logic, with a maximum of three conditions per entry):

    • Condition 1: Statistical Interval: 60, Threshold: 10, Action: Block, and Throttling Interval: 3600.

    • Condition 2: Statistical Interval: 1800, Threshold: 100, Action: Slider CAPTCHA, and Throttling Interval: 600.


Configuration example 2

  • Requirement: Allow requests that carry api in the header a maximum of 100 times within 10 minutes; if this threshold is exceeded, block them for one minute.

  • Configuration: Statistical Interval: 600, Threshold: 100, Action: Block, and Throttling Interval: 60.

Parameters and valid values

The valid values for the custom throttling configuration parameters are listed below.

  • Statistical Interval (Seconds): 5 to 1800

  • Threshold (Times): 2 to 50000

  • Action: Block, Monitor, or Slider CAPTCHA

  • Throttling Interval (Seconds): 60 to 86400
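A small helper that validates a throttling rule against these ranges before submission; the field names are illustrative, not the console's actual parameter keys:

```python
# Validate a custom-throttling rule against the documented value ranges.
LIMITS = {
    "statistical_interval": (5, 1800),     # seconds
    "threshold": (2, 50000),               # times
    "throttling_interval": (60, 86400),    # seconds
}
ACTIONS = {"block", "monitor", "slider_captcha"}

def validate_rule(rule: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the rule is valid."""
    errors = []
    for field, (lo, hi) in LIMITS.items():
        value = rule.get(field)
        if not isinstance(value, int) or not lo <= value <= hi:
            errors.append(f"{field} must be an integer in [{lo}, {hi}]")
    if rule.get("action") not in ACTIONS:
        errors.append(f"action must be one of {sorted(ACTIONS)}")
    return errors
```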