2.20.5.3. Indexing Frequency Limit

Indexing bots regularly request site pages to update the information stored in their databases, which is later used to serve search results. These bots are useful, but too high a crawl rate can create unnecessary load on the server.

To limit Googlebot's crawl rate:

  1. Log in to Google Search Console, or register if you do not have an account yet.
  2. Add your domain in this system and verify it, if this has not been done already.
  3. Open the crawl rate settings page and select the resource you need.
  4. Enable the switch next to "Limit Google's maximum crawl rate", set the desired value, and click "Save".
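Before tightening the limit, it is worth checking how often the bot actually requests pages. A minimal sketch of such a check in Python, assuming a combined-format web server access log at the hypothetical path /var/log/nginx/access.log:

    import re
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path, adjust for your server

    # Combined log format: the timestamp is in [...], the User-Agent is the last quoted field
    line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}):\d{2}:\d{2} [^\]]*\].*"([^"]*)"\s*$')

    hits_per_hour = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = line_re.search(line)
            if match and "Googlebot" in match.group(2):
                hits_per_hour[match.group(1)] += 1  # key: day/month/year:hour

    for hour, hits in hits_per_hour.items():
        print(f"{hour}: {hits} Googlebot requests")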
To limit Bingbot's crawl rate:

  1. Log in to Bing Webmaster Tools, or sign up using a Microsoft, Google, or Facebook account.
  2. Add your domain in this system and verify it, if this has not been done already.
  3. Select the site you need, go to "Configuration" → "Crawl Control", choose one of the predefined crawl rate patterns, or choose "Custom" and set the crawl rate for each hour of the day yourself, then click "Save changes".
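Note that these panel settings only affect the genuine search engine bots. Before blaming Googlebot or Bingbot for excessive load, you can confirm that a request really came from the search engine with a reverse DNS lookup confirmed by a forward lookup, as both engines recommend. A minimal sketch using Python's standard library (the IP addresses are example values taken from an access log):

    import socket

    def is_genuine_bot(ip, expected_suffixes):
        """Check a crawler IP: reverse DNS lookup confirmed by a forward lookup."""
        try:
            host = socket.gethostbyaddr(ip)[0]             # reverse lookup: IP -> hostname
            if not host.endswith(expected_suffixes):       # e.g. ".search.msn.com" for Bingbot
                return False
            return ip in socket.gethostbyname_ex(host)[2]  # forward lookup must return the same IP
        except (socket.herror, socket.gaierror):
            return False

    print(is_genuine_bot("66.249.66.1", (".googlebot.com",)))   # Googlebot hosts end in googlebot.com
    print(is_genuine_bot("157.55.39.1", (".search.msn.com",)))  # Bingbot hosts end in search.msn.com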

Attention!

Some bots may completely ignore the Crawl-delay directive described below.

As a rule, indexing bots follow the directives specified in robots.txt, where you can set the indexing interval. To do this:

  1. Create the file robots.txt in the root directory of the site, or edit it if it already exists.
  2. In robots.txt, add the Crawl-delay directive:
    User-Agent: *
    Disallow:
    Crawl-delay: 3

    Instead of *, you can specify the User-Agent of a specific bot whose indexing frequency should be limited. The Crawl-delay value sets the interval in seconds between the bot's requests.
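You can check how a compliant client reads such a file with Python's standard urllib.robotparser module, which exposes the directive via RobotFileParser.crawl_delay(). A minimal sketch, assuming the example file above is published at the hypothetical address https://example.com/robots.txt:

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://example.com/robots.txt")  # hypothetical site
    parser.read()  # download and parse the live robots.txt

    # crawl_delay() returns the Crawl-delay value for the given User-Agent, or None if not set
    print(parser.crawl_delay("*"))                        # -> 3 for the example file above
    print(parser.can_fetch("*", "https://example.com/"))  # an empty Disallow: permits everything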
