
PHP closing a connection early: script hangs if any output is sent afterwards

Tags:

php

I'm closing the connection to the client early with this:

static public function early_close( $output )
{
    ignore_user_abort(true);
    echo $output;

    // Disable gzip compression in Apache, as it can cause this request to be
    // buffered until it is complete, regardless of other settings.
    if (function_exists('apache_setenv')) {
        apache_setenv('no-gzip', 1);
    }

    // Get the size of the output
    $size = ob_get_length();

    // Send headers telling the browser to close the connection
    header("Content-Length: $size");
    header('Connection: close');
    header('Content-Encoding: none'); // keep Apache from compressing anything

    // If running under PHP-FPM:
    // fastcgi_finish_request();

    // Flush all output
    if( ob_get_level() > 0 )
    {
        ob_end_flush();
        ob_get_level() ? ob_flush() : null;
        flush();
    }

    // If you're using sessions, this prevents subsequent requests
    // from hanging while the background process executes
    if( session_id() )
    {
        session_write_close();
    }
}

It works fine, but after this point, if the script outputs anything (either by echoing or by adding a new header), it stops executing right there.
I've tried starting output buffering after the early close and then discarding it, but that doesn't work:

Server::early_close();
ob_start();
heavy_work();
ob_clean();

Any ideas?
I'm using PHP 5.3.x.

asked Aug 23 '13 by Pherrymason

3 Answers

The classic code to do that is:

ob_end_clean();
header("Connection: close");
ignore_user_abort();              // optional
ob_start();

echo ('Text the user will see');

$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();                   // Strange behaviour, will not work
flush();                          // Unless both are called !

// Do processing here
sleep(30);

echo('Text user will never see');

Otherwise, if you want to make asynchronous calls, I recommend the following read: Methods for asynchronous processes in PHP
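
On a related note: when PHP runs under PHP-FPM (which the commented-out line in the question hints at), fastcgi_finish_request() is usually the simplest way to get the same early close. A rough sketch, assuming FPM is available (the sleep() just stands in for the heavy work):

echo 'Text the user will see';
fastcgi_finish_request();   // flushes the response and closes the connection (PHP-FPM only)

// From here on the script keeps running in the background; any further output is discarded.
ignore_user_abort(true);
set_time_limit(0);
sleep(30);                  // stand-in for the heavy work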

answered Oct 15 '22 by Toto


You need an echo chr(0); after the echo $output;: sending a null byte forces the browser to end the connection. Also, I'm assuming there is an ob_start() before the Server::early_close() call? If not, you need it for ob_get_length() to work properly.
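
A rough sketch of the call order this assumes (Server::early_close() and heavy_work() are the names from the question; appending chr(0) to the argument has the same effect as echoing it inside early_close()):

ob_start();                                              // must be active so ob_get_length() sees the output
Server::early_close('Text the user will see' . chr(0));

// The client sees the connection as closed; keep working server-side.
heavy_work();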

answered Oct 16 '22 by Wing Lian


IMHO you shouldn't take this route. HTTP requests should be as short as possible to improve usability.

If some "heavy processing" needs to be performed, you can "schedule" it using some sort of queue. A separate process/daemon on the server can pick these jobs up from the queue and perform them. The HTTP application can then check whether such a job is still waiting to be processed, has been started, or is done.

There are many libraries available to facilitate this: Gearman, ØMQ, RabbitMQ, etc.

HTTP requests aren't really suited to long-running operations, which is why you run into all sorts of problems when you try :)

UPDATE

If you're unable to use libraries (like Gearman, etc) on the server, you could build your own file- or db-based queue, push "commands" into the queue from within your application, and have a cronjob read that queue and perform those tasks.
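
A minimal sketch of such a file-based queue (the file path, the JSON payload and worker.php are made-up names for illustration):

// In the web request: push a job instead of doing the heavy work inline.
file_put_contents(
    '/var/spool/myapp/queue.jobs',
    json_encode(array('task' => 'heavy_work', 'args' => array(42))) . PHP_EOL,
    FILE_APPEND | LOCK_EX
);

// worker.php, run from cron (e.g. * * * * * php /path/to/worker.php):
$queueFile = '/var/spool/myapp/queue.jobs';
if (is_file($queueFile)) {
    $jobs = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($queueFile, '');        // naive "take everything" approach, fine for a sketch
    foreach ($jobs as $line) {
        $job = json_decode($line, true);
        // Dispatch to the real task here, e.g. call heavy_work() with $job['args'].
    }
}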

answered Oct 15 '22 by Jasper N. Brouwer