I offer a zip file download with PHP like this:
header("Content-Type: application/zip");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header('Content-Description: File Transfer');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: '.filesize($location));
readfile($location);
How can I handle this when the file is located on a remote server such as S3 or Dropbox (where I have the necessary rights, of course)?
I don't want any redirection, because the users shouldn't see the original location.
Do I have to download the file and (temporarily) store it on the server?
You can (and possibly should!) store the file locally, but you do not have to.
So there are a few possible solutions here. These examples assume that $filename has either been safely generated or has been sanitized with something like:
$filename = preg_replace('/[^\w.]/', '', $filename); //sanitize
1) readfile, with allow_url_fopen enabled: (see http://www.php.net/manual/en/features.remote-files.php for further details)
readfile("http://url/to/your/$filename");
2) Something more cache-friendly, like:
// Serve a file from a remote server.
function serveFile($filename) {
    // Folder to locally cache files (trailing slash matters).
    // Ensure your PHP user has write access.
    $cacheFolder = '/path/to/some/cache/folder/';
    // URL to the folder you'll be downloading from.
    $remoteHost = 'http://remote.host/path/to/folder/';
    $cachedFile = $cacheFolder . $filename;
    // Cache the file if we haven't already.
    if (!file_exists($cachedFile)) {
        // May want to test these two calls, and log failures.
        file_put_contents($cachedFile, file_get_contents($remoteHost . $filename));
    } else {
        // Set the last accessed time.
        touch($cachedFile);
    }
    readfile($cachedFile) or die("Well, shoot");
    // Optionally, clear old files from the cache.
    clearOldFiles($cacheFolder);
}
// Clear old files from cache folder, based on last mtime.
// Could also clear depending on space used, etc.
function clearOldFiles($cacheFolder) {
    $maxTime = 60 * 60 * 24; // 1 day: use whatever works best.
    if ($handle = opendir($cacheFolder)) {
        while (false !== ($file = readdir($handle))) {
            // Skip the "." and ".." directory entries.
            if ($file === '.' || $file === '..') {
                continue;
            }
            if ((time() - filemtime($cacheFolder . $file)) > $maxTime) {
                unlink($cacheFolder . $file);
            }
        }
        closedir($handle);
    }
}
3) Use cURL, if you do not have access to enable allow_url_fopen.
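A rough sketch of the cURL approach, streaming the remote file straight to the client so it never has to fit in memory. The function name and $remoteUrl are placeholders; call it after sending your headers:

```php
<?php
// Stream a remote file to the client with cURL.
// Assumes the Content-* headers have already been sent.
function serveFileWithCurl($remoteUrl) {
    $ch = curl_init($remoteUrl);
    // Write the response body directly to the output stream.
    $out = fopen('php://output', 'wb');
    curl_setopt($ch, CURLOPT_FILE, $out);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_FAILONERROR, true);    // fail on HTTP >= 400
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($out);
    return $ok;
}
```

One caveat of streaming directly: you can't easily send an accurate Content-Length header unless you issue a HEAD request first.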
4) Use an external program like wget if you do not have CURL installed and cannot install it.
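For the wget route, something like the following could fetch the file into the cache folder before you readfile() it. This is a sketch; the function name is made up, and you should verify wget's exit status:

```php
<?php
// Download a remote file with wget into a local path.
// Returns true on success, false otherwise.
function fetchWithWget($remoteUrl, $cachedFile) {
    $cmd = 'wget -q -O ' . escapeshellarg($cachedFile)
         . ' ' . escapeshellarg($remoteUrl);
    exec($cmd, $output, $status);
    return $status === 0; // wget exits non-zero on failure
}
```

Always run the URL and path through escapeshellarg(), as above, before handing them to the shell.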
5) Worst case: open a socket to port 80 on the remote server, and just send an HTTP request for the file.
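The raw-socket version might look roughly like this. It is a minimal HTTP/1.0 sketch: no chunked encoding, no redirects, no HTTPS, and the host/path arguments are assumptions:

```php
<?php
// Minimal HTTP/1.0 GET over a raw socket, streaming the body
// to the client. Handles none of the harder cases (redirects,
// chunked encoding, TLS) - last resort only.
function fetchViaSocket($host, $path) {
    $fp = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$fp) {
        return false;
    }
    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
    // Skip the response headers (everything up to the blank line).
    while (($line = fgets($fp)) !== false && trim($line) !== '') {
    }
    // Stream the body to the client in chunks.
    while (!feof($fp)) {
        echo fread($fp, 8192);
    }
    fclose($fp);
    return true;
}
```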
6) Your web server may be able to do some kind of proxying redirect that means you don't actually need any code to accomplish this, and you would get caching and other optimizations for free. For example, see the documentation on mod_proxy for Apache here: https://httpd.apache.org/docs/2.2/mod/mod_proxy.html
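As an illustration of the mod_proxy idea (hypothetical paths and host, and mod_proxy/mod_proxy_http must be enabled), the Apache config could look something like:

```
# Transparently proxy /downloads/ to the remote host,
# so the original location is never exposed to users.
ProxyPass        /downloads/ http://remote.host/path/to/folder/
ProxyPassReverse /downloads/ http://remote.host/path/to/folder/
```

Your PHP code (or plain links) can then point at /downloads/$filename and Apache does the rest.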
Option 6 is best if you can manage it at all. Otherwise, the first two are the most likely to be needed, but I can fill in some example code for the others if you like :)