
Serving large files with PHP

Tags:

php

apache

So I am trying to serve large files via a PHP script. They are not in a web-accessible directory, so this is the best way I can figure to provide access to them.

The only way I could think of off the bat to serve this file is by loading it into memory (fopen, fread, etc.), setting the header data to the proper MIME type, and then just echoing the entire contents of the file.

The problem with this is that I have to load these ~700 MB files into memory all at once and keep the whole thing there until the download finishes. It would be nice if I could stream in just the parts I need as they are being downloaded.

Any ideas?

Adam asked Jan 11 '09

4 Answers

You don't need to read the whole thing into memory - just enter a loop, reading it in, say, 32 KB chunks and sending each chunk as output. Better yet, use fpassthru, which does much the same thing for you....

$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
fpassthru($fp);
exit;
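For comparison, the manual chunked loop mentioned above might look like the following sketch (the 32 KB buffer size is arbitrary, and stream_file is just an illustrative name):

```php
<?php
// Sketch: stream a file in fixed-size chunks so memory use stays at
// one buffer's worth, instead of loading the whole file at once.
function stream_file(string $path, int $chunkSize = 32 * 1024): void
{
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunkSize); // read and emit one chunk
        flush();                     // hand it to the client right away
    }
    fclose($fp);
}

$name = 'mybigfile.zip';
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($name));
stream_file($name);
exit;
```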

even fewer lines if you use readfile, which doesn't need the fopen call...

$name = 'mybigfile.zip';

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
readfile($name);
exit;

If you want to get even cuter, you can support HTTP range requests: the client sends a Range header, and you answer with a 206 status and a Content-Range header, letting clients request a particular byte range of your file. This is particularly useful for serving PDF files to Adobe Acrobat, which requests just the chunks of the file it needs to render the current page. It's a bit involved, but see this for an example.
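A hedged sketch of single-range support follows (path and MIME type are illustrative; real clients may send multiple ranges or malformed values, and a production version would also need validation and a 416 response for unsatisfiable ranges):

```php
<?php
// Sketch: honor a single "Range: bytes=start-end" request header.
$name  = 'mybigfile.pdf'; // illustrative path
$size  = filesize($name);
$start = 0;
$end   = $size - 1;

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    if ($m[1] === '' && $m[2] !== '') {      // "bytes=-N": last N bytes
        $start = max(0, $size - (int)$m[2]);
    } else {
        if ($m[1] !== '') $start = (int)$m[1];
        if ($m[2] !== '') $end   = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: application/pdf');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

// Seek to the start of the range and stream only the requested bytes.
$fp = fopen($name, 'rb');
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $buf = fread($fp, min(32 * 1024, $remaining));
    echo $buf;
    $remaining -= strlen($buf);
}
fclose($fp);
exit;
```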

Paul Dixon answered Oct 09 '22


The best way to send big files with PHP is the X-Sendfile header. It lets the web server serve the file itself, much faster, through zero-copy mechanisms like sendfile(2). It is supported by lighttpd, and by Apache with a plugin (mod_xsendfile).

Example:

$file = "/absolute/path/to/file"; // can be protected by .htaccess
header('X-Sendfile: '.$file);
header('Content-type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($file).'"');
// other headers ...
exit;

The server reads the X-Sendfile header and sends out the file.
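On the server side, with the third-party mod_xsendfile module for Apache, a minimal configuration might look like this (directive names per that module; the path is illustrative):

```apache
# Requires mod_xsendfile to be installed and loaded.
XSendFile On
# Whitelist the directory PHP is allowed to hand off:
XSendFilePath /home/files/protected
```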

d0k answered Oct 09 '22


While fpassthru() has been my first choice in the past, the PHP manual actually recommends* using readfile() instead, if you are just dumping the file as-is to the client.

* "If you just want to dump the contents of a file to the output buffer, without first modifying it or seeking to a particular offset, you may want to use the readfile(), which saves you the fopen() call." —PHP manual

Henrik Paul answered Oct 09 '22


If your files are not accessible by the web server because the path is not under your web-serving directory (htdocs), you can make a symbolic link (symlink) to that folder inside your web-serving directory, to avoid passing all traffic through PHP.

You can do something like this

ln -s /home/files/big_files_folder /home/www/htdocs

Using PHP for serving static files is a lot slower; if you have high traffic, memory consumption will be very large and it may not handle a big number of requests. Note that for Apache to follow the symlink, the directory needs Options +FollowSymLinks enabled.

codeassembly answered Oct 09 '22