I have been following this tutorial on how to use curl_multi: http://arguments.callee.info/2010/02/21/multiple-curl-requests-with-php/
I can't tell what I am doing wrong, but curl_multi_getcontent is returning null. It is supposed to return JSON. I know it is not the MySQL call, because I had it working with a while loop and standard curl_exec, but the page was taking too long to load. (I've changed some of the setopt details for security.)
Relevant PHP code snippet (I do close the while loop at the end):
$i = 0;
$ch = array();
$mh = curl_multi_init();
while ($row = $result->fetch_object()) {
    $ch[$i] = curl_init();
    curl_setopt($ch[$i], CURLOPT_CAINFO, 'cacert.pem');
    curl_setopt($ch[$i], CURLOPT_USERPWD, "$username:$password");
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch[$i], CURLOPT_URL, 'https://mysite.com/search/'.$row->username.'/');
    curl_multi_add_handle($mh, $ch[$i]);
    $i++;
}
$running = 0;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);
$result->data_seek(0);
$i = 0;
while ($row = $result->fetch_object()) {
    $data = curl_multi_getcontent($ch[$i]);
    $json_data = json_decode($data);
    var_dump($json_data);
EDIT
Here is the code that currently works, but causes the page to load too slowly:
$ch = curl_init();
curl_setopt($ch, CURLOPT_CAINFO, 'cacert.pem');
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
while ($row = $result->fetch_object()) {
    curl_setopt($ch, CURLOPT_URL, 'https://mysite.com/search/'.$row->username.'/');
    $data = curl_exec($ch);
    $json_data = json_decode($data);
    var_dump($json_data);
}
I'm wondering:
$i = 0;
while ($row = $result->fetch_object()) {
    $data = curl_multi_getcontent($ch[$i]);
    $json_data = json_decode($data);
    var_dump($json_data);
Are you forgetting to increment $i? As written, the loop grabs the content for $ch[0] on the first pass and then keeps calling curl_multi_getcontent on that same handle on every iteration.
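If so, the fix is just to advance the counter each time around. Here is a sketch of the corrected second loop, written against the variables from your question (untested, and with the usual handle cleanup added):
$result->data_seek(0);
$i = 0;
while ($row = $result->fetch_object()) {
    $data = curl_multi_getcontent($ch[$i]);
    $json_data = json_decode($data);
    var_dump($json_data);
    curl_multi_remove_handle($mh, $ch[$i]); // release each handle once its content is read
    $i++;                                   // advance to the next handle
}
curl_multi_close($mh);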
Also, I've written a blog post covering concurrent requests with PHP's cURL extension, and it contains a general function for curl multi requests. You could call this function in the following manner:
$responses = multi(
    $requests = [
        ['url' => 'https://example.com/search/username1/'],
        ['url' => 'https://example.com/search/username2/'],
        ['url' => 'https://example.com/search/username3/']
    ],
    $opts = [
        CURLOPT_CAINFO => 'cacert.pem',
        CURLOPT_USERPWD => "username:password"
    ]
);
Then, you cycle through the responses array:
foreach ($responses as $response) {
    if ($response['error']) {
        // handle error
        continue;
    }
    // check for empty response
    if ($response['data'] === null) {
        // examine $response['info']
        continue;
    }
    // handle data
    $data = json_decode($response['data']);
    // do something
}
Using this function, you can run a simple test of accessing HTTPS sites with the following call:
multi(
    $requests = [
        'google' => ['url' => 'https://www.google.com'],
        'linkedin' => ['url' => 'https://www.linkedin.com/']
    ],
    $opts = [
        CURLOPT_CAINFO => '/path/to/your/cacert.pem',
        CURLOPT_SSL_VERIFYPEER => true
    ]
);
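The blog post has the full implementation. As a rough idea of how such a helper could be built on top of curl_multi, here is a minimal sketch that follows the request/response shape used above (a 'url' key per request, shared $opts, and 'data'/'error'/'info' keys per response). It is an illustration only, not the post's actual code:
function multi(array $requests, array $opts = []) {
    $mh = curl_multi_init();
    $handles = [];

    // Create one easy handle per request, applying the shared options
    foreach ($requests as $key => $request) {
        $ch = curl_init($request['url']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt_array($ch, $opts);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Run all transfers, waiting on socket activity instead of busy-looping
    $active = null;
    do {
        $mrc = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $mrc == CURLM_OK);

    // Collect results keyed the same way as the requests
    $responses = [];
    foreach ($handles as $key => $ch) {
        $responses[$key] = [
            'data'  => curl_multi_getcontent($ch),
            'error' => curl_error($ch),
            'info'  => curl_getinfo($ch),
        ];
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $responses;
}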
I see that your execution loop is different from the one that is advised in the PHP documentation:
do {
    $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);
Note that in the while condition it is the function's return value that is compared, not the second parameter.
Edit: Thanks to Adam's comment I have tested both syntaxes and see that they are equal and asynchronous. Here is a working example of asynchronous multi-request with getting content into variable:
<?php
$ch = array();
$mh = curl_multi_init();
$total = 100;
echo 'Start: ' . microtime(true) . "\n";
for ($i = 0; $i < $total; $i++) {
    $ch[$i] = curl_init();
    curl_setopt($ch[$i], CURLOPT_URL, 'http://localhost/sleep.php?t=' . $i);
    curl_setopt($ch[$i], CURLOPT_HEADER, 0);
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch[$i]);
}
$active = null;
do {
    $mrc = curl_multi_exec($mh, $active);
    usleep(100); // Maybe needed to limit CPU load (see P.S.)
} while ($active);
foreach ($ch as $i => $c) {
    $r = curl_multi_getcontent($c);
    var_dump($r);
    curl_multi_remove_handle($mh, $c);
}
curl_multi_close($mh);
echo 'End: ' . microtime(true) . "\n";
And the test file, sleep.php:
<?php
$start = microtime(true);
sleep( rand(3, 5) );
$end = microtime(true);
echo $_GET['t'], ': ', $start, ' - ', $end, ' - ', ($end - $start);
echo "\n";
P.S. The initial idea of using usleep inside the loop was to pause it a bit and thus reduce the number of operations while cURL waits for a response. At first it seemed to work that way, but later tests with top showed only a minimal difference in CPU load (17% with usleep versus 20% without it), so I do not know whether to use it or not. Maybe tests on a real server would show different results.
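A related option (my own suggestion, not part of the original answer) is to let cURL block until there is socket activity by calling curl_multi_select, which avoids spinning the loop at all. Roughly:
$active = null;
do {
    $mrc = curl_multi_exec($mh, $active);
    if ($active) {
        // Block (up to the default timeout) until at least one transfer has activity
        curl_multi_select($mh);
    }
} while ($active && $mrc == CURLM_OK);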
Edit 2: I have tested my code making a request to a password-protected HTTPS page (with CURLOPT_CAINFO and CURLOPT_USERPWD set to the same values as in the question). It works as expected. Probably there is a bug in your version of PHP or cURL. My versions are "PHP Version 5.3.10-1ubuntu3.8" and cURL 7.22.0; they have no problems.