 

cURL Multi Threading with PHP

Tags: php, curl, nginx

I'm using cURL to get some rank data for over 20,000 domain names that I've got stored in a database.

The cURL class I'm using is the one from http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading.

The array $competeRequests holds 20,000 requests to the compete.com API for website ranks.

This is an example request: http://apps.compete.com/sites/stackoverflow.com/trended/rank/?apikey=xxxx&start_date=201207&end_date=201208&jsonp=

Since there are 20,000 of these requests I want to break them up into chunks so I'm using the following code to accomplish that:

foreach (array_chunk($competeRequests, 1000) as $requests) {
    foreach ($requests as $request) {
        $curl->addSession($request, $opts);
    }
}

This works great for sending the requests in batches of 1,000; however, the script takes too long to execute. I've already increased max_execution_time to over 10 minutes.

Is there a way to send 1,000 requests from my array then parse the results then output a status update then continue with the next 1,000 until the array is empty? As of now the screen just stays white the entire time the script is executing which can be over 10 minutes.
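One way to get exactly that behaviour (a sketch that uses PHP's built-in curl_multi functions directly rather than the SEM Labs class; `fetch_batch` and `process_all` are hypothetical helper names, not part of any library) is to run each chunk of 1,000 on its own multi handle, then print and flush a status line before starting the next chunk:

```php
<?php
// Sketch: batch 20,000 URLs through curl_multi, 1,000 at a time,
// flushing a progress line to the browser after each batch.

function fetch_batch(array $urls, int $timeout = 30): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch; // keyed by URL; assumes URLs are unique
    }

    // Drive all transfers in this batch until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // sleep until there is activity
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}

function process_all(array $competeRequests, int $batchSize = 1000): void
{
    $total = count($competeRequests);
    $done  = 0;

    foreach (array_chunk($competeRequests, $batchSize) as $batch) {
        $results = fetch_batch($batch);
        // ... parse/store $results here ...
        $done += count($batch);
        echo "Processed $done of $total requests\n";
        @ob_flush();
        flush(); // push the status line out so the page is not blank
    }
}
```

Note that flush() only reaches the browser if nothing downstream is buffering the response; since nginx is involved here, you may also need to disable proxy/fastcgi buffering (or gzip) for this page for the incremental status updates to appear.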

asked Sep 12 '12 by user1647347


2 Answers

The accepted answer is outdated, so the correct answer has to be upvoted: PHP now supports fetching multiple URLs concurrently out of the box via the curl_multi family of functions.

http://php.net/manual/en/function.curl-multi-init.php
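The pattern from that manual page, in minimal form (the file:// URLs here are stand-ins for real API URLs so the snippet runs without network access):

```php
<?php
// Minimal curl_multi sketch: run two transfers concurrently on one
// multi handle and collect both bodies.

$tmp = [tempnam(sys_get_temp_dir(), 'a'), tempnam(sys_get_temp_dir(), 'b')];
file_put_contents($tmp[0], 'first');
file_put_contents($tmp[1], 'second');
$urls = array_map(fn($f) => 'file://' . $f, $tmp);

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $i => $url) {
    $handles[$i] = curl_init($url);
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $handles[$i]);
}

// Drive all transfers until none are still active.
do {
    curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh);
    }
} while ($active);

$bodies = [];
foreach ($handles as $i => $ch) {
    $bodies[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```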

answered Sep 18 '22 by Mani


This one always does the job for me: https://github.com/petewarden/ParallelCurl
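For reference, ParallelCurl's usage pattern looks roughly like this (a sketch from the repo's README; `parallelcurl.php`, `ParallelCurl`, `startRequest`, and `finishAllRequests` are the library's names, the callback is yours, and the signatures should be verified against the repo):

```php
<?php
// Sketch of ParallelCurl usage; verify against the repo before relying
// on it. $competeRequests is assumed to be the array from the question.
require_once 'parallelcurl.php';

// Called once per completed request with the response body.
function on_request_done($content, $url, $ch, $user_data)
{
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo "$url -> HTTP $code, " . strlen($content) . " bytes\n";
}

// Keep at most 10 requests in flight at a time, rather than firing
// 1,000-wide bursts all at once.
$parallel_curl = new ParallelCurl(10, [CURLOPT_RETURNTRANSFER => true]);

foreach ($competeRequests as $request) {
    $parallel_curl->startRequest($request, 'on_request_done');
}

$parallel_curl->finishAllRequests(); // block until everything completes
```

The nice property here is that the library throttles concurrency for you: new requests start as old ones finish, so memory stays bounded and results arrive incrementally via the callback.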

answered Sep 18 '22 by Glenn Plas