Slow down spidering of website

Is there a way to force a spider to slow down its spidering of a website? Anything that can be put in headers or robots.txt?

I thought I remembered reading something about this being possible, but I can't find anything now.

asked Jan 29 '10 22:01 by Bryan Migliorisi

1 Answer

If you're referring to Google, you can throttle the speed at which Google spiders your site by using your Google Webmaster account (Google Webmaster Tools).

There is also the Crawl-delay directive, which you can put in robots.txt:

User-agent: *
Crawl-delay: 10

Here Crawl-delay specifies the number of seconds a crawler should wait between successive page requests. Of course, like everything else in robots.txt, the crawler has to choose to respect it, so YMMV.
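A polite crawler can honor the directive itself. As a minimal sketch, Python's standard-library `urllib.robotparser` (Python 3.6+) can read the delay back from a robots.txt file; the user-agent name "MyBot" below is made up for illustration:

```python
from urllib import robotparser

# The robots.txt content from the answer above.
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as freshly fetched; crawl_delay() returns None otherwise
rp.parse(ROBOTS_TXT.splitlines())

# "MyBot" matches no specific User-agent entry, so it falls back to "*".
delay = rp.crawl_delay("MyBot")
print(delay)  # 10
```

A crawler would then `time.sleep(delay)` between requests when the value is not None.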

answered Nov 14 '22 06:11 by Robert Harvey