 

How fast can I actually crawl a website?

Tags:

web-crawler

I'm gonna crawl a website for some information. It's about 170,000+ pages. So, how many requests can I make? I'm gonna extract the HTML and get some information from it. This is an already very popular site, so I don't think it would die if I was just cruising fast over all the pages... The only thing that makes me nervous is whether the owner will block my IP or something if you do that? Is that normal? Should I just load 5 pages/min? Then it will take forever... I want to get new data every 24 hours, you see.

Thanks for all response!

IQlessThan70 asked Jan 21 '23 22:01


1 Answer

It will take some time. I suggest you use rotating proxies and add multi-threading; 10 threads will do. This way, you can have 10 requests in flight at the same time. Using proxies will be slow, though, and adding a delay of at least 1.5 seconds per request will slow you down further, but it lowers the risk of getting banned.
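A minimal sketch of that setup in Python: 10 worker threads, a thread-safe rotating proxy pool, and a 1.5-second delay per request. The proxy addresses, the URL pattern, and the `fetch` stub are all placeholders; real code would issue the HTTP request through the chosen proxy (e.g. with `urllib` or the `requests` library) instead of just sleeping.

```python
import itertools
import threading
import time
from concurrent.futures import ThreadPoolExecutor

# Placeholder proxy pool -- substitute your own rotating proxies here.
PROXIES = ["proxy1:8080", "proxy2:8080", "proxy3:8080"]
proxy_cycle = itertools.cycle(PROXIES)
proxy_lock = threading.Lock()

def next_proxy():
    # itertools.cycle isn't thread-safe on its own, so guard it with a lock.
    with proxy_lock:
        return next(proxy_cycle)

def fetch(url):
    proxy = next_proxy()
    # A real crawler would perform the HTTP request through `proxy` here
    # and parse the returned HTML; this stub only models the rate limit.
    time.sleep(1.5)  # per-request delay to lower the risk of a ban
    return (url, proxy)

# Hypothetical URL list standing in for the site's ~170,000 pages.
urls = [f"https://example.com/page/{i}" for i in range(20)]

with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch, urls))
```

For scale: with 10 threads each waiting 1.5 s per request, 170,000 pages takes roughly 170,000 × 1.5 / 10 ≈ 25,500 s, about 7 hours, so a daily refresh should fit comfortably inside a 24-hour window.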

Ruel answered Mar 16 '23 17:03