I'm using Guzzle in my Laravel project, and I hit a memory crash when I make a request to an API that returns a huge payload.
I have this at the top of my CURL.php class, with a get() method that uses Guzzle:
use GuzzleHttp\Exception\GuzzleException;
use GuzzleHttp\Client;
class CURL {
    public static function get($url) {
        $client = new Client();
        $options = [
            'http_errors'      => true,
            'force_ip_resolve' => 'v4',
            'connect_timeout'  => 2,
            'read_timeout'     => 2,
            'timeout'          => 2,
        ];
        $result = $client->request('GET', $url, $options);
        $result = (string) $result->getBody();
        $result = json_decode($result, true);
        return $result;
    }
    ...
}
When I call it like this in my application, it requests a large payload (account 30000, roughly 23 MB according to the error below):
$url = 'http://site/api/account/30000';
$response = CURL::get($url)['data'];
I kept getting this error
cURL error 28: Operation timed out after 2000 milliseconds with 7276200 out of 23000995 bytes received (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
How do I avoid this?
Should I increase these settings?
'connect_timeout' => 2,
'read_timeout' => 2,
'timeout' => 2,
Yes, you need to increase read_timeout and timeout. The error is clear: the response is not transferred in full before the timeout expires; whether the server is slow, the network is slow, or the payload is simply large doesn't matter. In your case only about 7 MB of the roughly 23 MB response had arrived when the 2-second timeout hit.
If possible, increasing the timeouts is the easiest fix.
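For example, the options could be relaxed like this (the values below are only an illustration; pick whatever your API realistically needs, and note that 'timeout' => 0 means wait indefinitely):

$options = [
    'http_errors'      => true,
    'force_ip_resolve' => 'v4',
    'connect_timeout'  => 5,   // seconds allowed to establish the connection
    'read_timeout'     => 30,  // seconds allowed between reads of a streamed body
    'timeout'          => 30,  // total seconds allowed for the whole request
];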
If the server supports pagination, requesting the data part by part is a better approach (see the sketch below).
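As a rough sketch, assuming the API accepted page and per_page query parameters and reported the last page in its response (both the parameter names and the 'data'/'meta' structure are assumptions about an API I can't see), the fetch could be split up like this:

$page = 1;
$accounts = [];
do {
    // Hypothetical paginated endpoint: parameter names and response keys are assumptions.
    $chunk = CURL::get("http://site/api/account/30000?page={$page}&per_page=500");
    $accounts = array_merge($accounts, $chunk['data']);
    $page++;
} while ($page <= ($chunk['meta']['last_page'] ?? 1));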
You can also use asynchronous requests in Guzzle and send something to your end user while you are waiting for the response from the API, for example:
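A minimal sketch with Guzzle's requestAsync() (reusing the same $url and $options as above; error handling is kept to a bare rethrow):

$client = new Client();

// Send the request without blocking the current flow.
$promise = $client->requestAsync('GET', $url, $options)->then(
    function ($response) {
        // Decode the body once the response has arrived.
        return json_decode((string) $response->getBody(), true);
    },
    function ($reason) {
        // Connection error or timeout; rethrow (or log) as appropriate.
        throw $reason;
    }
);

// ... do other work or send an interim reply to the user here ...

$data = $promise->wait(); // blocks here until the API finally responds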