
How to make asynchronous HTTP requests in PHP

People also ask

Which method is used to make an asynchronous HTTP request?

Ajax is the traditional way to make an asynchronous HTTP request. Data can be sent using the HTTP POST method and received using the HTTP GET method.

Does PHP support asynchronous?

PHP has no built-in support for asynchronous calls, but you can make pseudo-asynchronous calls using cURL.
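For example, here is a rough sketch of pseudo-asynchronous requests using the curl_multi API, which runs several transfers concurrently inside a single PHP process (the URLs are placeholders):

$mh = curl_multi_init();
$handles = array();
$responses = array();

// Queue the transfers; curl_multi performs them concurrently
foreach (array('http://example.com/a', 'http://example.com/b') as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive the transfers until every handle has finished
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status == CURLM_OK);

// Collect the responses and clean up
foreach ($handles as $ch) {
    $responses[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);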

What are asynchronous HTTP requests?

Asynchronous HTTP Request Processing is a relatively new technique that allows you to process a single HTTP request using non-blocking I/O and, if desired, in separate threads. Some refer to it as COMET capabilities.

Is PHP Curl asynchronous?

The short answer is no, it isn't asynchronous. The longer answer is "not unless you wrote the backend yourself to do so." If you're using XHR, each request gets a different worker thread on the backend, which means no request should block any other, short of hitting process and memory limits.


The answer I'd previously accepted didn't work; it still waited for responses. This one does work, though. It's taken from How do I make an asynchronous GET request in PHP?

function post_without_wait($url, $params)
{
    // Build a URL-encoded request body from the parameter array
    $post_params = array();
    foreach ($params as $key => $val) {
        if (is_array($val)) {
            $val = implode(',', $val);
        }
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) {
        return;
    }

    // Write the request and close the socket immediately,
    // without reading the response
    $out  = "POST ".(isset($parts['path']) ? $parts['path'] : '/')." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}
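
For example (the URL and parameters here are just placeholders):

// Fire-and-forget: returns as soon as the request has been written
post_without_wait('http://example.com/longtask.php', array(
    'user_id' => 123,
    'tags'    => array('a', 'b'),  // arrays are joined with commas
));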

If you control the target that you want to call asynchronously (e.g. your own "longtask.php"), you can close the connection from that end, and both scripts will run in parallel. It works like this:

  1. quick.php opens longtask.php via cURL (no magic here)
  2. longtask.php closes the connection and continues (magic!)
  3. cURL returns to quick.php when the connection is closed
  4. Both tasks continue in parallel

I have tried this, and it works just fine. But quick.php won't know anything about how longtask.php is doing, unless you create some means of communication between the processes.
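
For illustration, the quick.php side can be as simple as this sketch (the URL is a placeholder):

// quick.php: fire the request; cURL returns as soon as longtask.php
// closes the connection using the snippet below
$ch = curl_init('http://example.com/longtask.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

echo 'longtask.php keeps running in the background';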

Try this code in longtask.php, before you do anything else. It will close the connection, but still continue to run (and suppress any output):

// Discard any existing output buffers
while (ob_get_level()) ob_end_clean();
// Tell the client we are done, and keep running even after it disconnects
header('Connection: close');
ignore_user_abort(true);
// Send a small body with an explicit Content-Length so the client
// knows the response is complete and closes the connection
ob_start();
echo 'Connection Closed';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();

The code is copied from the PHP manual's user-contributed notes and somewhat improved.


You can do trickery by using exec() to invoke something that can make HTTP requests, like wget, but you must direct all of the program's output somewhere, such as a file or /dev/null; otherwise the PHP process will wait for that output.

If you want to separate the process from the Apache thread entirely, try something like this (I'm not sure about it, but I hope you get the idea):

exec('bash -c "wget -O /dev/null \'(url goes here)\' > /dev/null 2>&1 &"');

It's not a nice business, and you'll probably want something like a cron job invoking a heartbeat script which polls an actual database event queue to do real asynchronous events.
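
As a rough sketch of that idea (the jobs table, its columns, and the PDO credentials are all assumptions), a heartbeat script run from cron might look like:

// heartbeat.php - run from cron, e.g. every minute
// Polls a hypothetical `jobs` table and performs the pending HTTP calls
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$jobs = $pdo->query("SELECT id, url FROM jobs WHERE status = 'pending' LIMIT 10");
foreach ($jobs as $job) {
    $ch = curl_init($job['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);

    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute(array($job['id']));
}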


As of 2018, Guzzle has become the de facto standard library for HTTP requests, used in several modern frameworks. It's written in pure PHP and does not require installing any custom extensions.

It can do asynchronous HTTP calls very nicely, and can even pool them, for example when you need to make 100 HTTP calls but don't want to run more than 5 at a time (a pooled sketch follows the concurrent example below).

Concurrent request example

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait for all of the requests to complete. Throws a ConnectException
// if any of the requests fail.
$results = Promise\unwrap($promises);

// Alternatively, wait for the requests to complete, even if some of them fail.
// settle() wraps each outcome in an array with a 'state' and a 'value' or 'reason'.
$results = Promise\settle($promises)->wait();

// You can access each result using the key from the $promises array above.
echo $results['image']['value']->getHeader('Content-Length')[0];
echo $results['png']['value']->getHeader('Content-Length')[0];

See http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests
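
For the pooling case mentioned above (at most 5 concurrent requests out of 100), Guzzle's Pool can be used. Here is a sketch along the lines of that same documentation page (the httpbin URLs are just examples):

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Generator that yields the 100 requests lazily
$requests = function ($total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', 'http://httpbin.org/get?i=' . $i);
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 5, // never more than 5 requests in flight
    'fulfilled'   => function ($response, $index) {
        // called for each successful response
    },
    'rejected'    => function ($reason, $index) {
        // called for each failed request
    },
]);

// Initiate the transfers and wait for the pool to drain
$pool->promise()->wait();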