We're building an API that performs repeated file_get_contents() calls.
I have an array of user IDs, and file_get_contents() is called once for each entry in that array, so we end up making thousands of requests.
// Fetch one user's data from the API and decode the JSON response
function request($userid) {
    global $access_token;
    $url = 'https://<api-domain>'.$userid.'?access_token='.$access_token;
    $response = file_get_contents($url);
    return json_decode($response);
}
function loopUserIds($arrayUserIds) {
    global $countFollowers;
    $arrayAllUserIds = array();
    foreach ($arrayUserIds as $userid) {
        $followers = request($userid);
        ...
    }
    ...
}
My concern is that fetching everything takes a long time, since request() is called once per iteration of the loop. How can we make these many file_get_contents() requests run faster?
As @HankyPanky mentioned, you can use curl_multi_exec() to run many requests concurrently.
Something like this should help:
function fetchAndProcessUrls(array $urls, callable $f) {
    $multi = curl_multi_init();
    $reqs = [];
    foreach ($urls as $url) {
        $req = curl_init();
        curl_setopt($req, CURLOPT_URL, $url);
        curl_setopt($req, CURLOPT_HEADER, 0);
        // Needed so curl_multi_getcontent() returns the body instead of curl printing it
        curl_setopt($req, CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($multi, $req);
        $reqs[] = $req;
    }
    // Kick off the transfers
    $active = null;
    do {
        $mrc = curl_multi_exec($multi, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);
    // Keep driving curl until all transfers have completed
    while ($active && $mrc == CURLM_OK) {
        if (curl_multi_select($multi) == -1) {
            usleep(100); // avoid busy-waiting if select() fails
        }
        do {
            $mrc = curl_multi_exec($multi, $active);
        } while ($mrc == CURLM_CALL_MULTI_PERFORM);
    }
    // Hand each response to the callback, then clean up the handles
    foreach ($reqs as $req) {
        $f(curl_multi_getcontent($req));
        curl_multi_remove_handle($multi, $req);
        curl_close($req);
    }
    curl_multi_close($multi);
}
You can use it like so:
$urlArray = [ 'http://www.example.com/', 'http://www.example.com/', ... ];
fetchAndProcessUrls($urlArray, function($requestData) {
    /* do stuff here */
    // e.g.
    $jsonData = json_decode($requestData, true);
});
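To tie this back to your question, here is a minimal sketch of how you might feed your user IDs into fetchAndProcessUrls(). It assumes $access_token and the 'https://<api-domain>' placeholder from your request() function are in scope, and that $arrayUserIds is the array you already pass to loopUserIds():

// Build one URL per user ID, reusing the URL pattern from request()
$urls = array_map(function($userid) use ($access_token) {
    return 'https://<api-domain>'.$userid.'?access_token='.$access_token;
}, $arrayUserIds);

// Decode each response as it is handed back by the callback
fetchAndProcessUrls($urls, function($requestData) {
    $followers = json_decode($requestData);
    // ... aggregate $followers here, as you did inside loopUserIds()
});

This way all the HTTP round-trips overlap instead of running one after another, which is where the time was going with sequential file_get_contents() calls.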