Is there a way to force a spider to slow down its spidering of a website? Anything that can be put in headers or robots.txt?
I thought I remembered reading something about this being possible, but I can't find anything now.
If you're referring to Google, you can throttle the speed at which Google spiders your site using your Google Webmaster Tools account (since renamed Google Search Console), which has a crawl-rate setting.
There is also the Crawl-delay directive, which you can put in robots.txt:
User-agent: *
Crawl-delay: 10
Here Crawl-delay specifies the number of seconds a crawler should wait between successive page requests. Of course, like everything else in robots.txt, a crawler only honors it if it chooses to, so YMMV. Notably, Googlebot ignores Crawl-delay, which is why Google provides the crawl-rate setting mentioned above.
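
As for headers: there's nothing you can send that forces a crawler to slow down, but well-behaved bots will back off when they receive a 503 response with a Retry-After header. Below is a minimal sketch of that idea at the application level, assuming Flask just for illustration; the MIN_INTERVAL value, the BOT_TOKENS list, and the crude user-agent sniffing are all my own placeholder choices, not anything standard.

# Sketch: throttle crawlers by replying 503 + Retry-After when a
# bot-like user agent requests pages too quickly. Assumes Flask;
# MIN_INTERVAL and BOT_TOKENS are illustrative values, not standards.
import time

from flask import Flask, request, make_response

app = Flask(__name__)

MIN_INTERVAL = 10                          # assumed minimum seconds between bot requests
BOT_TOKENS = ("bot", "spider", "crawler")  # crude user-agent matching, for illustration
last_seen = {}                             # user-agent string -> timestamp of last request

@app.before_request
def throttle_bots():
    ua = request.headers.get("User-Agent", "").lower()
    if not any(token in ua for token in BOT_TOKENS):
        return None  # not a recognized bot; serve normally
    now = time.time()
    if now - last_seen.get(ua, 0.0) < MIN_INTERVAL:
        # Ask the crawler to come back later; polite bots honor this.
        resp = make_response("Please slow down.", 503)
        resp.headers["Retry-After"] = str(MIN_INTERVAL)
        return resp
    last_seen[ua] = now
    return None

@app.route("/")
def index():
    return "Regular page content."

Again, this only works for crawlers that respect 503/Retry-After; a misbehaving bot will ignore it, and for those your only real options are rate limiting or blocking at the server or firewall level.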