Rate limiting is a method of improving network security by limiting network traffic. It restricts how many times someone can repeat an action within a given interval, such as repeatedly requesting a website resource. Rate limiting helps prevent malicious bot activity while also reducing the load on web servers.
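To illustrate the idea, here is a minimal sketch of a fixed-window rate limiter in Python. The class name, the per-IP keying, and the default limits are illustrative assumptions, not the product's actual implementation.

```python
import time
from collections import defaultdict

class RateLimiter:
    """Allow at most `limit` actions per `window` seconds for each client IP.

    A simple fixed-window counter; illustrative only."""

    def __init__(self, limit=100, window=60):
        self.limit = limit
        self.window = window
        # ip -> [window_start_time, request_count]
        self.counters = defaultdict(lambda: [0.0, 0])

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        start, count = self.counters[ip]
        if now - start >= self.window:
            # The previous window has expired: start a new one.
            self.counters[ip] = [now, 1]
            return True
        if count < self.limit:
            self.counters[ip][1] = count + 1
            return True
        # Limit reached inside the current window: reject.
        return False
```

For example, with `limit=2, window=60`, a third request from the same IP within the same minute is rejected, while a request after the window resets is allowed again.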
Crawlers visit your website so that it can be indexed and improve its search engine ranking. However, they can sometimes send so many requests that the server becomes unavailable under the excessive load. With this feature, you can limit how often a crawler can access your site.
Web crawlers scan and index websites. Legitimate crawlers such as Googlebot and Bingbot improve your site's indexing in search engines, but a number of fake crawlers impersonate them and can cause problems for your website. Once this feature is enabled, all fake Google and Bing crawlers will be blocked.
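One common way to tell a genuine crawler from a fake one is the double reverse-DNS check that Google and Bing document for their bots: look up the hostname for the connecting IP, confirm it belongs to a known crawler domain, then confirm that hostname resolves back to the same IP. The sketch below assumes this technique; the injectable lookup parameters are there only so it can be exercised without network access and are not part of any real API.

```python
import socket

def is_genuine_crawler(ip,
                       allowed_suffixes=(".googlebot.com", ".google.com",
                                         ".search.msn.com"),
                       reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                       forward_lookup=socket.gethostbyname):
    """Verify a claimed search crawler with a double reverse-DNS lookup."""
    try:
        host = reverse_lookup(ip)  # IP -> hostname
    except OSError:
        return False
    if not host.endswith(allowed_suffixes):
        # Hostname is not in a known crawler domain.
        return False
    try:
        # The hostname must resolve back to the same IP (forward-confirmed).
        return forward_lookup(host) == ip
    except OSError:
        return False
```

A spoofed User-Agent alone passes neither step, which is why this check is stronger than matching the request header.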
You can whitelist the top crawlers to improve your website's indexing in search engines. When this feature is enabled, whitelisted crawlers are not blocked by rate limiting.
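As a rough sketch of how such a bypass could work, the snippet below skips rate limiting for requests whose User-Agent matches a whitelisted crawler name. The list and function names are hypothetical; in the product the whitelist is a configured setting, and because User-Agent strings are easy to spoof, a check like this is normally paired with DNS verification of the source IP.

```python
# Hypothetical whitelist of crawler User-Agent substrings (illustrative only).
WHITELISTED_CRAWLERS = ("Googlebot", "Bingbot", "DuckDuckBot")

def is_whitelisted(user_agent):
    """Return True if the User-Agent names a whitelisted crawler."""
    return any(name in user_agent for name in WHITELISTED_CRAWLERS)

def should_rate_limit(user_agent):
    # Whitelisted crawlers bypass rate limiting entirely;
    # everyone else is subject to the normal limits.
    return not is_whitelisted(user_agent)
```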
If you are looking for anything you cannot find, please drop us an email at firstname.lastname@example.org
Need Help? We are right here!