I have a PHP script that allows users to download large files with download resume.
This works fine, but I have limited the download speed in the PHP code to 200 KB/s per connection.
Mozilla Firefox's download manager makes only one connection, so the speed is around 200 KB/s, but Free Download Manager or JDownloader makes 2 to 4 connections, so the download speed becomes 200 KB/s * 2 or 4 = 400 to 800 KB/s.
How can I stop this and allow only one connection per user for downloading this file?
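For context, a minimal sketch of the kind of throttled download loop described above (an assumed setup, not the asker's actual script; test.pdf and the 200 KB chunk size are placeholders, and resume handling is omitted):
$file = "test.pdf";                 // placeholder file name
$chunkSize = 200 * 1024;            // 200 KB sent per second = roughly 200 KB/s per connection
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
header("Content-Length: " . filesize($file));
$handle = fopen($file, "rb");
while (!feof($handle)) {
    echo fread($handle, $chunkSize);
    flush();                        // push the chunk to the client
    sleep(1);                       // wait one second before sending the next chunk
}
fclose($handle);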
A. I think the first thing for you to do is disable Content-Range. From the HTTP/1.1 specification:
14.16 Content-Range
The Content-Range entity-header is sent with a partial entity-body to specify where in the full entity-body the partial body should be applied. Range units are defined in section 3.12.
Download managers can download a single file over two or more connections because of Range requests. If you disable this, neither download resume nor multiple connections can be used on a single file; every request for the file would have to start from the beginning.
Example
# Load mod_headers, advertise that ranges are not accepted, and drop any Range header the client sends
LoadModule headers_module modules/mod_headers.so
Header set Accept-Ranges none
RequestHeader unset Range
You should also look at 14.35.1 Byte Ranges
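If you prefer to do this in PHP rather than in the Apache configuration, a minimal sketch (the script and test.pdf here are only placeholders) is to send Accept-Ranges: none and simply ignore any Range header, so every request starts at byte 0:
$realFile = "test.pdf";                           // placeholder file
header("Accept-Ranges: none");                    // tell clients that partial requests are not supported
header("Content-Type: application/pdf");
header("Content-Length: " . filesize($realFile));
// Any Range header the client sent is deliberately ignored, so the
// download always starts from the beginning of the file.
readfile($realFile);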
B. Introduce download sessions.
You can generate a unique id for each download and serve the file via a PHP page. If the download is still active or has been requested before, you just exit the page.
Example
$realFile = "test.pdf";
$fakeFile = uniqid("file");
$uniqid = isset($_REQUEST['id']) ? $_REQUEST['id'] : null;
if (empty($uniqid) || strlen($uniqid) < 20 || !ctype_xdigit($uniqid)) {
die("Die! Die! Die! Stolen URL");
}
$memcache = new \Memcache();
$memcache->connect('localhost', 11211);
$runtime = (int) $memcache->get($uniqid);
if ($runtime) {
die("Die! Die! Die! You Multiple Down loader");
} else {
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT\n");
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Content-Transfer-Encoding: binary");
header("Content-disposition: attachment; filename=$fakeFile.pdf"); //
header('Content-type: application/pdf');
header("Content-length: " . filesize($realFile));
readfile($realFile);
$memcache->set($uniqid, 1);
}
Simple Client
$url = "a.php?id=" . bin2hex(mcrypt_create_iv(30, MCRYPT_DEV_URANDOM));
printf("<a href='%s'>Download Here</a>",$url);
It would output something like
<a href='a.php?id=aed621be9d43b0349fcc0b942e84216bf5cd34bcae9b0e33b9d913cccd6e'>Download Here</a>
You also need to map each id to a particular file, as sketched below.
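One way to do that mapping (a sketch only, reusing the Memcache instance from above; the file_ key prefix and the one-hour expiry are assumptions):
// When generating the link, remember which real file the id belongs to
$memcache = new \Memcache();
$memcache->connect('localhost', 11211);
$id = bin2hex(random_bytes(30));
$memcache->set("file_" . $id, "test.pdf", 0, 3600);   // id -> real file, valid for one hour
printf("<a href='a.php?id=%s'>Download Here</a>", $id);

// In a.php, resolve the id back to a file before serving it
$realFile = $memcache->get("file_" . $_REQUEST['id']);
if ($realFile === false) {
    die("Unknown or expired download id");
}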