My client wanted a way to offer downloads to users, but only after they fill out a registration form (basically name and email). An email is sent to the user with links to the downloadable content. The links contain a registration hash unique to the package, file, and user, and they actually point to a PHP page that logs each download and pushes the file out by writing it to stdout (along with the appropriate headers). This solution has inherent flaws, but it's how they wanted to do it. It needs to be said that I pushed them hard to 1.) limit the sizes of the downloadable files and 2.) think about using a CDN (they have international customers but are hosted in the US on 2 mirrored servers and a load balancer that uses sticky IPs).

Anyway, it "works for me", but some of their international customers are on really slow connections (download rates of ~60 kB/sec) and some of these files are pretty big (150 MB). Since a PHP script is serving these files, it is bound by the script timeout setting. At first I set this to 300 seconds (5 minutes), but that was not enough time for some of the beta users. I then tried calculating the script timeout from the file size divided by a 100 kB/sec connection rate, but some of these users are even slower than that.
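For context, the delivery script is roughly this shape — a minimal sketch, where `validate_hash()` and `log_download()` stand in for the real registration-lookup and logging code, which isn't shown here:

```php
<?php
// Hypothetical sketch of the pass-through download script.
// validate_hash() maps a registration hash to a local file path (or false);
// log_download() records who downloaded what. Both are assumptions.

$hash = $_GET['h'] ?? '';
$file = validate_hash($hash);
if ($file === false || !is_file($file)) {
    http_response_code(404);
    exit;
}

log_download($hash);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

// Stream the file to the client. This is the part that ties up a PHP
// worker for the entire (possibly 40-minute) transfer and is subject
// to the script timeout.
readfile($file);
```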
Now the client wants to just raise the timeout value. I don't want to remove the timeout altogether in case the script somehow gets into an infinite loop, and I don't want to keep pushing the timeout out arbitrarily to cover some catch-all, lowest-common-denominator connection rate (most people download much faster than 100 kB/sec). I also want to be able to tell the client at some point: "Look, these files are too big to process this way. You are affecting the performance of the rest of the website with these 40-plus-minute connections. We either need to rethink how they are delivered or use much smaller files."
I have a couple of solutions in mind, which are as follows:
Chances are #1–#3 will be declined or at least pushed off. So is #4 a good way to go about this, or is there something else I haven't considered?
(Feel free to challenge the original solution.)
Use X-Sendfile. Most web servers support it either natively or through a plugin (mod_xsendfile for Apache). Using this header, you simply specify a local file path and exit the PHP script; the web server sees the header and serves that file itself.
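With mod_xsendfile enabled and an `XSendFilePath` covering the download directory, the PHP side shrinks to something like this (the path and filename are illustrative):

```php
<?php
// After validating the registration hash and logging the download...
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="package.zip"');

// Apache (via mod_xsendfile) intercepts this header and streams the
// file itself. The PHP worker is released immediately, so the script
// timeout no longer depends on how slow the client's connection is.
header('X-Sendfile: /var/downloads/package.zip');
exit;
```

nginx has the same idea under a different name: you send `X-Accel-Redirect` pointing at an `internal` location instead.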
The easy solution would be to disable the timeout. You can do this on a per-request basis with:
set_time_limit(0);
If your script is not buggy, this shouldn't be a problem – unless your server is unable to handle so many long-lived concurrent connections from slow clients.
In that case, #1, #2, and #3 are all good solutions, and I would go with whichever is cheapest. Your concerns about #1 could be mitigated by generating download tokens that can only be used once, or only for a short period of time.
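A one-time, expiring token can be sketched like this — the `download_tokens` table, its columns, and the PDO wiring are assumptions for illustration, not from the question:

```php
<?php
// Issue a token: cryptographically random, stored with an expiry,
// and flagged as used on first redemption.
function issue_token(PDO $db, int $userId, int $fileId): string {
    $token = bin2hex(random_bytes(32));
    $stmt = $db->prepare(
        'INSERT INTO download_tokens (token, user_id, file_id, expires_at, used)
         VALUES (?, ?, ?, DATE_ADD(NOW(), INTERVAL 1 DAY), 0)'
    );
    $stmt->execute([$token, $userId, $fileId]);
    return $token;
}

// Redeem a token: the UPDATE succeeds at most once, and only before
// the expiry, so replayed or stale links are rejected atomically.
function redeem_token(PDO $db, string $token): bool {
    $stmt = $db->prepare(
        'UPDATE download_tokens SET used = 1
         WHERE token = ? AND used = 0 AND expires_at > NOW()'
    );
    $stmt->execute([$token]);
    return $stmt->rowCount() === 1; // 0 rows -> invalid, expired, or already used
}
```

Doing the "used" check and the flag-set in a single `UPDATE` avoids a race where two simultaneous requests both see the token as unused.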
Option #4, in my opinion, is not a great one. Download speed can vary greatly over the course of a transfer, so any estimate you make up front will, with significant probability, be wrong.