Running concurrent PHP scripts

I'm having the following problem with my VPS server.

I have a long-running PHP script that sends big files to the browser. It does something like this:

<?php
header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>

This basically reads the file from the server's file system and sends it to the browser. I can't just use direct links (and let Apache serve the file) because there is business logic in the application that needs to be applied.

The problem is that while such a download is running, the site doesn't respond to other requests.

Emil M asked Jan 17 '12

3 Answers

The problem you are experiencing is related to the fact that you are using sessions. When a script has a running session, it locks the session file to prevent concurrent writes which may corrupt the session data. This means that multiple requests from the same client - using the same session ID - will not be executed concurrently; instead they are queued and can only execute one at a time.

Multiple users will not experience this issue, as they will use different session IDs. This does not mean that you don't have a problem, because you may conceivably want to access the site whilst a file is downloading, or set multiple files downloading at once.

The solution is actually very simple: call session_write_close() before you start to output the file. This will close the session file, release the lock and allow further concurrent requests to execute.
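A minimal sketch of the fix, applied to the question's download script (the file name is the asker's placeholder; the business-logic comment marks where your own checks would go):

```php
<?php
session_start();

// ... business logic that needs the session (auth checks, etc.) goes here ...

// Release the session lock before streaming the file, so other
// requests from the same client can run while the download proceeds.
session_write_close();

header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>
```

Anything after `session_write_close()` can still read `$_SESSION`, but writes to it will no longer be persisted, so do all session updates before the call.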

DaveRandom answered Nov 19 '22


Your server setup is probably not the only place you should be checking.

Try doing a request from your browser as usual and then do another from some other client: either wget from the same machine, or another browser on a different machine.
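A sketch of that test with curl (the URL and session cookie value are hypothetical; substitute your own). If only the request carrying the same session cookie is slow, the session lock is the culprit rather than the server setup:

```shell
# Start the long download in the background, as "client A"
# (hypothetical URL and PHPSESSID; use your real ones).
curl -s -o big.zip -b "PHPSESSID=abc123" "https://example.com/download.php" &

# While it runs, hit the site with the SAME session cookie...
time curl -s -o /dev/null -b "PHPSESSID=abc123" "https://example.com/"

# ...and again with NO cookie, simulating a different user.
time curl -s -o /dev/null "https://example.com/"
```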

zaf answered Nov 19 '22


In what way doesn't the server respond to other requests? Is it "Waiting for example.com..." or does it give an error of any kind?

I do something similar, but I serve the file in chunks, which gives the file system a break while the client accepts and downloads each chunk. That is better than sending the entire thing at once, which is quite demanding on the file system and the server.

EDIT: While not the answer to this question, the asker also asked about reading a file in chunks. Here's the function that I use; supply it the full path to the file.

function readfile_chunked($file_path, $retbytes = true)
{
    $cnt = 0;
    $chunksize = 1 * (1024 * 1024); // 1 MB chunk size

    $handle = fopen($file_path, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return number of bytes delivered, like readfile() does
    }
    return $status;
}
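A sketch of how the function above might be called from a download script. It assumes `readfile_chunked()` is already defined in scope; the path is a placeholder, and the `session_write_close()` call ties in the session-lock fix from the accepted answer:

```php
<?php
// Assumes readfile_chunked() from above is defined in this script.
$path = '/path/to/really-big-file.zip'; // placeholder path

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . basename($path) . '"');

// Release the session lock, then stream the file in 1 MB chunks.
session_write_close();
$bytes = readfile_chunked($path);
exit();
?>
```

Sending `Content-Length` lets the browser show download progress, which plain chunked echoing otherwise hides.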
i-CONICA answered Nov 19 '22