I'm going to crawl a website for some information. It's about 170,000+ pages. So, how many requests can I make? I'll fetch the HTML and extract some information from it. It's already a very popular site, so I don't think it would go down if I just cruised quickly over all the pages... The only thing that makes me nervous is that I don't know whether the owner will block my IP or something if I do that. Is that normal? Should I just load 5 pages/min? Then it would take forever... I want to get fresh data every 24 hours, you see.
Thanks for all responses!
It will take some time. I suggest you use rotating proxies and add multi-threading; 10 threads will do, so you can have 10 requests in flight at once. Going through proxies will be slower, and you should add a delay of at least 1.5 seconds between requests on each thread. That slows you down, but it lowers the risk of getting banned. At 10 threads and 1.5 s per request, 170,000 pages works out to roughly 170,000 × 1.5 / 10 ≈ 25,500 s, about 7 hours, so a 24-hour refresh cycle is still feasible.
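If you're working in Python, here's a minimal sketch of that setup using requests and a thread pool. The proxy addresses and page URLs below are placeholders I made up for illustration; swap in your own proxy pool and the site's real URL pattern.

```python
import time
from itertools import cycle
from concurrent.futures import ThreadPoolExecutor

import requests

# Hypothetical proxy pool and URL pattern -- placeholders, not real endpoints.
PROXIES = cycle([
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
    "http://proxy3.example.com:8080",
])
URLS = ["https://example.com/page/%d" % i for i in range(1, 170001)]

def fetch(url):
    """Fetch one page through the next proxy in the rotation."""
    proxy = next(PROXIES)  # next() on cycle() is effectively atomic under CPython's GIL
    try:
        resp = requests.get(
            url,
            proxies={"http": proxy, "https": proxy},
            timeout=10,  # HTTP timeout: give up on dead proxies
        )
        resp.raise_for_status()
        return resp.text
    except requests.RequestException:
        return None  # failed pages can be retried in a second pass
    finally:
        time.sleep(1.5)  # per-thread delay to lower the ban risk

# 10 worker threads -> roughly 10 requests in flight at any moment.
with ThreadPoolExecutor(max_workers=10) as pool:
    for html in pool.map(fetch, URLS):
        if html is not None:
            pass  # parse the HTML here
```

With the sleep in the finally block, each thread tops out at well under one request per second, so the pool as a whole stays around 6-7 requests/s, which matches the ~7-hour estimate above.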