Configuring Apache for Large Uploads

I am developing a file uploading service for my company. Our users often send us very large .zip files filled with very large Illustrator files. Generally the files won't be larger than 1.5GB, but I need to plan on handling files up to 4GB.

Obviously, this raises a lot of concerns about how I configure Apache to allow such large file transfers without overloading my server or opening up security holes.

Concern 1: Memory Limits

One feature of my system is that users should be able to download their files back after they have uploaded them. I've avoided using a standard download (just linking to the file) because of security issues - I can't let users download each other's files. My solution was to keep the uploaded files in a protected directory outside of the www-root and load them in via a PHP script. That PHP script looks a little like this:

$fo = fopen($uploadDir . $file_name, "rb");   // "rb" for a binary-safe read
while (!feof($fo)) {
    $chunk = fread($fo, 102400);              // read 100 KB at a time
    echo $chunk;
    ob_flush();                               // push the chunk out of PHP's output buffer
}
fclose($fo);

I've put the fread into a loop and limited it to loading small chunks of the file at a time. I did that because my server only has 4GB of RAM and needs to be able to handle multiple people downloading concurrently (probably a max of 20 people). My first question is how large a chunk I should read at a time. Right now downloading a 1.5GB file is painfully slow because of that chunk size.
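If it matters, this is roughly the variation I've been considering - a bigger chunk plus an explicit flush() after ob_flush(). The 1MB chunk size is just a guess on my part, not something I've benchmarked:

$fo = fopen($uploadDir . $file_name, "rb");
while (!feof($fo)) {
    echo fread($fo, 1048576);   // 1MB per iteration -- guessed value, not tuned
    ob_flush();                 // flush PHP's output buffer
    flush();                    // ask the web server to send it to the client now
}
fclose($fo);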

Concern 2: Max Execution Time

Another issue with uploading/downloading such large files is max execution time. While everything is pretty speedy on my internal network, I should be prepared for users with slow connections. I'm setting my lower-end upload rate at 1Mbps, which comes to about 35 minutes for uploading. Let's be generous and say that they can download at twice that speed, so 15-ish minutes to download.

Is it a security risk to set my max_execution_time to 30 minutes? Is it going to kill anything on my server? I don't have any reason to think it's a bad idea, but my gut is just screaming that it would be stupid to allow a script to run that long. Is there a better way to do what I'm trying to do?
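For what it's worth, I was planning to raise the limit only inside the upload/download scripts rather than globally - something like this, where 1800 seconds is just my 30-minute guess:

// At the top of the upload/download script only, not server-wide in php.ini:
set_time_limit(1800);   // allow this request up to 30 minutes (0 would mean no limit)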


I've seen a couple of other similar questions, and most of them suggested using something like Java or Silverlight. If there is any sensible way, I would like to avoid Java. I am currently using SWFUpload and the accompanying jQuery plugin.

asked Jul 21 '11 by jwegner

1 Answer

Concern 1: Memory Limits

readfile() streams the file, so the HTTP response is built on the fly and the file is never loaded into memory completely.
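A minimal sketch of what I mean - the $uploadDir path and the ownership check are placeholders for your own logic:

<?php
// download script -- sketch only; adapt paths and access checks to your setup
$uploadDir = '/var/uploads/';                    // outside the www-root
$file      = basename($_GET['file']);            // basename() blocks directory traversal
$path      = $uploadDir . $file;

if (!is_file($path) /* || !userOwnsFile($file) */) {   // plug in your own ownership check
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $file . '"');

while (ob_get_level()) {
    ob_end_clean();          // drop output buffering so nothing accumulates in memory
}
readfile($path);             // streams the file to the client chunk by chunk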

For uploads, however, the script must have enough memory available, because - if I remember right - the whole file passes through memory when uploading via PHP.
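Either way, the php.ini directives you would need to touch for ~4GB uploads look roughly like this - the values are illustrative, not tested recommendations:

; php.ini -- illustrative values for ~4GB uploads
upload_max_filesize = 4096M
post_max_size       = 4200M    ; must be at least as large as upload_max_filesize
memory_limit        = 512M     ; raise further only if uploads really are held in memory
max_execution_time  = 1800     ; 30 minutes
max_input_time      = 1800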

Concern 2: Max Execution Time

If you're concerned about security, handle authentication and access rights at the HTTPD server level. Then your script won't even be executed when the request is not authorized.
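For example, with plain HTTP Basic auth in the Apache configuration - "/download.php" and the htpasswd path are just placeholders for your setup:

# Apache config sketch -- protect the download script with Basic auth
<Location "/download.php">
    AuthType Basic
    AuthName "File downloads"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Location>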

answered Sep 30 '22 by hakre