Search engines use programs or scripts to collect content on the Internet based on specific rules in an automated manner. The programs or scripts are known as web crawlers, search engine spiders, or search engine robots.
A search engine robot consumes server traffic and bandwidth when it crawls a website. To reduce the load that a search engine robot places on your server, you can configure a DNS record for the website domain and select the search engine robot as the ISP line, so that requests from that robot resolve to the IP address you designate.
If you want to temporarily close your website, for example for search engine optimization (SEO) work, without affecting its search ranking, you can specify the search engine robot as the ISP line. The robot can then continue to index your website from the IP address that you specify.
For example, you can select Baiduspider as the ISP line and configure an A record that directs Baiduspider to the IP address of your web server, such as 192.0.2.0. Baiduspider then establishes a connection with the web server and collects the content of your website, so users can retrieve pictures and videos on your website by using the Baidu search engine.
1. Log on to the Alibaba Cloud DNS console.
2. In the left-side navigation pane, click Manage DNS.
3. On the Authority Domains tab, click the domain name to open the DNS Settings page.
4. On the DNS Settings page, click Add Record. In the Add Record dialog box, select Search Engine Robots from the ISP Line drop-down list and specify a search engine robot.
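The console steps above can also be performed through the Alibaba Cloud DNS API's AddDomainRecord operation, which accepts a Line parameter for the resolution line. The sketch below only assembles the request parameters; it does not send a request. The domain name `example.com` and the line code `"baidu"` for Baiduspider are assumptions for illustration, so check the console or the API reference for the exact line codes available on your account, and use the Alibaba Cloud SDK or CLI with valid credentials to actually submit the request.

```python
# Sketch: AddDomainRecord parameters for an A record on a search-engine-robot
# resolution line. "example.com" and the line code "baidu" are hypothetical.

def build_robot_record(domain: str, rr: str, ip: str, line: str) -> dict:
    """Build AddDomainRecord parameters for an A record on a robot ISP line."""
    return {
        "Action": "AddDomainRecord",
        "DomainName": domain,   # e.g. "example.com" (hypothetical)
        "RR": rr,               # host record; "@" means the domain apex
        "Type": "A",            # A record pointing the robot at the web server
        "Value": ip,            # web server IP, e.g. 192.0.2.0 from the example
        "Line": line,           # resolution line; "baidu" assumed for Baiduspider
    }

params = build_robot_record("example.com", "@", "192.0.2.0", "baidu")
print(params["Line"])  # the robot-specific line this record applies to
```

Records added this way coexist with your default-line records: ordinary visitors resolve through the default line, while the specified robot resolves through the record on its dedicated line.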