
php, file download

I am using the following simple file-download script:

if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}

It works on my local server with files up to 200 MB.

When I try the same code on my website, it downloads only 173 KB of the 200 MB file.

I have checked everything and written some custom code (using ob functions and fread instead of readfile), but I still can't download big files.

Thank you for your answers.

  • I am using Apache 2.2 and PHP 5.3
  • All PHP settings to deal with big files are OK (execution time, memory limit, ...); the sketch below shows the kind of settings meant here.
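
For reference, this is roughly the kind of configuration meant by that last bullet; the values below are illustrative, not the actual settings in use:

<?php
// Illustrative values only -- the settings typically checked before serving
// large downloads from PHP.
set_time_limit(0);                         // no execution-time limit while the download runs
ini_set('memory_limit', '256M');           // readfile() streams, but custom fread() loops may buffer
ini_set('zlib.output_compression', 'Off'); // on-the-fly compression can break Content-Length

// output_buffering cannot be changed at runtime; it belongs in php.ini
// or the vhost configuration:
//   output_buffering = Off
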
asked Apr 08 '11 by jsonx


3 Answers

One issue I have with the code above is that you have no control over the output stream; you are letting PHP handle it without knowing exactly what is going on in the background.

What you should do is set up an output system that you can control and replicate across servers.

For example:

if (file_exists($file))
{
    if (false !== ($handler = fopen($file, 'rb'))) // open read-only in binary mode
    {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="'.basename($file).'"');
        header('Content-Transfer-Encoding: chunked'); //changed to chunked
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        //header('Content-Length: ' . filesize($file)); //Removed

        //Send the content in chunks until EOF (fread() returns '' at EOF,
        //so testing feof() avoids an endless loop)
        while (!feof($handler))
        {
            echo fread($handler, 4096);
            flush(); // push each chunk to the client instead of buffering it
        }
        fclose($handler);
    }
    exit;
}
echo "<h1>Content error</h1><p>The file does not exist!</p>";

This is only basic, but give it a go!

Also read my reply here: file_get_contents => PHP Fatal error: Allowed memory exhausted

answered by RobertPitt


It seems readfile can have issues with large files. As @Khez asked, it could be that the script is simply running for too long. A quick Google search turned up a couple of examples of chunking the file:

http://teddy.fr/blog/how-serve-big-files-through-php
http://www.php.net/manual/en/function.readfile.php#99406
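
For completeness, here is a minimal sketch of the chunked approach described in those links (the function name and chunk size are mine, not taken from either page):

<?php
// Minimal sketch of a chunked readfile() replacement; the name and chunk size
// are illustrative.
function readfile_chunked($filename, $chunkSize = 1048576) // 1 MB per chunk
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    $bytesSent = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        echo $chunk;
        flush();                     // push each chunk to the client immediately
        $bytesSent += strlen($chunk);
    }

    fclose($handle);
    return $bytesSent;               // total number of bytes written out
}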

answered by Paul DelRe


One solution for certain scenarios is to let the PHP script decide intelligently which file to download and from where, but instead of sending the file directly from PHP, return a redirect to the client that contains a direct link, which is then handled by the web server alone.

This could be done in at least two ways: either the PHP script copies the file into a "download zone", which might, for example, be cleaned of "old" files regularly by some other background/service script, or you expose the real permanent location to the clients.

There are of course drawbacks, as with every solution. In the first case, depending on the client requesting the file (curl, wget, a GUI browser), it may not follow the redirect you send; in the second, the files are completely exposed to the outside world and can be read at any time without the (access) control of the PHP script.
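
To make the first variant concrete, a rough sketch follows; all paths, the user_may_download() check and the URL layout are hypothetical placeholders, and some background job would still need to clean the download zone:

<?php
// Rough sketch of the "download zone" variant. Paths, the access check and
// the URL layout are hypothetical placeholders.
$file         = '/protected/storage/report.zip'; // real, non-public location
$downloadZone = '/var/www/html/downloads/';      // directory served directly by Apache
$publicUrl    = '/downloads/';                   // URL mapped to that directory

if (!user_may_download($file)) {                 // your own access check
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}

// Copy the file under a hard-to-guess name, then redirect so the web server
// handles the actual transfer without PHP.
$name = md5(uniqid(mt_rand(), true)) . '-' . basename($file);
copy($file, $downloadZone . $name);

header('Location: ' . $publicUrl . rawurlencode($name));
exit;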

answered by Polarlight