2.20.5.3. Indexing Frequency Limit
Indexing bots regularly request pages of a site to refresh the information stored in their databases, which is later used to serve search results. These bots are useful, but an excessively high crawl rate can create unnecessary load on the server.

Google
- Log in to Google Search Console, or register if you do not have an account yet.
- Add the domain to this system and verify it, if it has not been verified yet.
- Go to the crawl rate settings page and select the desired resource.
Bing
- Log in to Bing Webmaster, or register using a social network account.
- Add the domain to this system and verify it, if it has not been verified yet.
- Go to the Crawl Control section and set the desired crawl rate.
Crawl-delay
Attention!
Some bots may completely ignore the Crawl-delay directive.

As a rule, indexing bots follow the directives specified in robots.txt, where you can set the indexing interval. To do this:
- Create a robots.txt file in the root directory of the site, or edit it if it already exists.
- In the robots.txt file, specify the Crawl-delay directive:

  User-Agent: *
  Disallow:
  Crawl-delay: 3
Instead of *, you can specify the User-Agent of a specific bot whose indexing frequency should be limited. The Crawl-delay parameter sets the interval in seconds between the bot's requests.
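For example, to slow down only one bot while leaving the others unrestricted, address it by name. A sketch of such a configuration; Bingbot is used purely as an illustration of a bot that honors Crawl-delay, so check each bot's documentation for its exact User-Agent string:

  User-Agent: Bingbot
  Crawl-delay: 10

  User-Agent: *
  Disallow: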
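If you write your own crawler, you can read this directive back with the urllib.robotparser module from the Python standard library. A minimal sketch, assuming the example rules shown above; the bot name MyBot and the example.com URLs are hypothetical placeholders:

    import time
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    # Parse the example rules from this section; for a live site you would
    # call rp.set_url("https://example.com/robots.txt") and then rp.read().
    rp.parse([
        "User-Agent: *",
        "Disallow:",
        "Crawl-delay: 3",
    ])

    delay = rp.crawl_delay("MyBot") or 1  # seconds; fall back to 1 if unset

    for path in ("/", "/news", "/contacts"):  # hypothetical pages
        url = "https://example.com" + path
        if rp.can_fetch("MyBot", url):
            # ... fetch and process the page here ...
            time.sleep(delay)  # wait between requests, as Crawl-delay asks

Here crawl_delay() returns 3, matching the directive above, and the loop pauses for that many seconds between requests.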