I'm using cURL to get some rank data for over 20,000 domain names that I've got stored in a database.
The code I'm using is the multi-threaded cURL class from http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading.
The array $competeRequests contains 20,000 requests to the compete.com API for website ranks.
This is an example request: http://apps.compete.com/sites/stackoverflow.com/trended/rank/?apikey=xxxx&start_date=201207&end_date=201208&jsonp=
Since there are 20,000 of these requests I want to break them up into chunks so I'm using the following code to accomplish that:
foreach (array_chunk($competeRequests, 1000) as $requests) {
    foreach ($requests as $request) {
        $curl->addSession($request, $opts);
    }
}
This works great for sending the requests in batches of 1,000; however, the script takes too long to execute. I've already increased max_execution_time to over 10 minutes.
Is there a way to send 1,000 requests from my array, parse the results, output a status update, and then continue with the next 1,000 until the array is empty? As it stands, the screen just stays white for the entire time the script is executing, which can be over 10 minutes.
The accepted answer above is outdated, so the correct answer deserves the upvotes.
PHP now supports fetching multiple URLs at the same time via the built-in curl_multi functions:
http://php.net/manual/en/function.curl-multi-init.php
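A minimal sketch of that curl_multi approach applied to the question's setup. It assumes $competeRequests holds the URLs; the batch size of 1,000 and the echo/flush progress update are carried over from what the asker wanted, not part of the curl_multi API itself:

```php
<?php
// Process $competeRequests in batches of 1,000 using curl_multi,
// printing a progress update between batches.
foreach (array_chunk($competeRequests, 1000) as $i => $batch) {
    $multi = curl_multi_init();
    $handles = array();

    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[] = $ch;
    }

    // Drive all transfers in this batch to completion.
    do {
        $status = curl_multi_exec($multi, $active);
        if ($active) {
            // Wait for activity instead of busy-looping.
            curl_multi_select($multi);
        }
    } while ($active && $status == CURLM_OK);

    foreach ($handles as $ch) {
        $result = curl_multi_getcontent($ch); // parse the rank JSON here
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    // Push a status update to the browser before the next batch.
    echo "Finished batch " . ($i + 1) . "\n";
    flush();
}
```

Because each batch is executed and parsed before the next one starts, output can be flushed between batches instead of the page staying blank until the whole run finishes.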
This one always does the job for me... https://github.com/petewarden/ParallelCurl
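For reference, a hedged usage sketch based on the ParallelCurl README; the callback name handle_result and the concurrency limit of 10 are my own choices, and $competeRequests is assumed from the question:

```php
<?php
require_once 'parallelcurl.php';

// Callback invoked as each request completes; $content is the response body.
function handle_result($content, $url, $ch, $user_data)
{
    // Parse the compete.com rank JSON here.
    echo "Finished $url\n";
}

$curl_options = array(CURLOPT_RETURNTRANSFER => true);

// Keep at most 10 requests in flight at once.
$parallel_curl = new ParallelCurl(10, $curl_options);

foreach ($competeRequests as $url) {
    $parallel_curl->startRequest($url, 'handle_result');
}

// Block until every outstanding request has completed.
$parallel_curl->finishAllRequests();
```

The library throttles the request queue for you, so there is no need to chunk the array by hand.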