
Letting Users Upload Huge Files to Website

So I need a little bit of advice. I'm building a website for academic purposes only, restricted to a very select group of people. The only purpose of this website is to act as a GUI for file uploads. These files typically range from 10 to 12 GB in size.

Through research, I found that I can use PHP scripts to upload the files and change the maximum file upload size in php.ini (although I couldn't find a hard upper limit on that setting). There are a couple of questions I have.
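For reference, these are the php.ini directives involved; the values below are illustrative, not recommendations. Both size caps must be raised, and `post_max_size` must be at least as large as `upload_max_filesize`:

```ini
; php.ini sketch -- values are illustrative only
upload_max_filesize = 13G    ; per-file cap
post_max_size       = 13G    ; whole-request cap; must be >= upload_max_filesize
max_execution_time  = 0      ; no script time limit (uploads this size take hours)
max_input_time      = -1     ; no limit on time spent reading request input
memory_limit        = 256M   ; uploads stream to a temp file, so this need not match the file size
```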

1) Do I need to figure out a way to keep the connection between the website and the user open, to avoid things such as connection timeouts? If so, is it enough to do this on the server side, or is it also an issue with the web browsers these users will be using (does the browser time out the connection)? I'm asking this because these uploads will take a huge amount of time.

2) What are some of the security issues I have to keep in mind? So far I've found and considered the following:

  1. restricting access to the website to a number of subnets (my intended academic users)

  2. the files that will be uploaded are in a special format with unique headers -- so I can check for these headers

  3. disabling CGI execution using .htaccess

  4. moving all uploads outside of the www root folder

  5. potentially finding an antivirus to check these files

  6. all users will authenticate before accessing the website -- this also ties into my first question: how long can they stay logged in, and how can I control that?
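The header check in point 2 can be sketched in a few lines of shell. The 4-byte magic `DATA` below is a hypothetical stand-in for the real format's header bytes:

```shell
# Compare an upload's first bytes against the format's magic number.
# "DATA" (hex 44 41 54 41) is a hypothetical stand-in for the real header.
expected="44 41 54 41"
printf 'DATA...payload...' > upload.bin      # stand-in for a received file

actual=$(head -c 4 upload.bin | od -An -tx1 | tr -s ' ' | sed 's/^ *//; s/ *$//')

if [ "$actual" = "$expected" ]; then
    echo "header ok: accept file"
else
    echo "header mismatch: reject file" >&2
fi
```

Note that a magic-number check only proves the first bytes match; it says nothing about the remaining 12 GB, so it complements rather than replaces the other controls on this list.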

The security aspect of this website is crucial to its development, which is a very tricky issue when you're working with 12 GB files.

These are some of the difficulties I've anticipated, but I'm sure there are more. What else do you think I should take into consideration? If there are other implementation approaches, please don't hesitate to suggest them.
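On point 6 (how long users stay logged in), PHP's own session settings give a first handle on session lifetime; the values below are illustrative:

```ini
; php.ini sketch -- session lifetime knobs (values illustrative)
session.gc_maxlifetime  = 3600   ; seconds of inactivity before session data may be purged
session.cookie_lifetime = 0      ; cookie lasts until the browser is closed
```

One caveat: a multi-hour upload in progress does not necessarily refresh the session, so the lifetime needs to comfortably exceed the longest expected transfer, or the session must be kept alive some other way.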

Additional information:

  • As of now, these users use scp to upload files to this server
  • These users are highly trusted by my community, but security is nevertheless its number one motto -- very few things are open to the public, and some of the information stored on these servers cannot be compromised
  • The files uploaded to the server are confidential data about real people -- so I must consider things such as packet sniffing

Thanks guys, I know this is a lot to swallow, but any help would really be appreciated.

asked Jun 13 '12 by Florin Stingaciu




3 Answers

Not knowing your users, I would recommend sticking with SCP. All of your security concerns have been addressed with SCP, and it is widely and freely available.

That being said, I understand that if your users are not technical, SCP is painful. If I were you, I would not try to implement this entirely as a web application. I would have a web front end that simply generates the appropriate SCP command for the user and executes it for them. This should be feasible with a Java applet, and you could provide rough progress through the browser by translating the SCP progress output that it feeds back to your process. If you go the applet route you'll run into some security issues, because applets can't usually start processes on the client's PC; to get around this you'll need to sign the applet and/or request higher privileges. Again, this may or may not be a good solution depending on how technical your users are.
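The front end's job in this design is just command generation. A sketch of the sort of command it might assemble, with a hypothetical username, file, and host:

```shell
# Sketch: build the SCP command a web front end might generate for a user.
# The username, filename, and host are hypothetical placeholders.
user="jdoe"
file="scan-data.dat"
cmd="scp -C ${file} ${user}@uploads.example.edu:/incoming/"   # -C enables compression
echo "$cmd"
```

In a real front end the pieces would come from the authenticated session and the user's file selection, and would need to be escaped before being handed to a process launcher.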

If you insist on going the web app route, make sure you use SSL/TLS for the transfer; that takes care of your packet-sniffing concerns. Also make sure the executable bit is not set on any of the huge files. If they need to be executable, insist that the user set the executable bit at download time, not while the file is being stored. If a file must be executed on the server, use the built-in file system permissions to sandbox the app. Heaven help you if they have to upload and execute on the server as root; if that's the case then security is out of the question :-)
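Clearing the executable bit on stored uploads, as suggested above, is a one-liner at upload time (filename hypothetical):

```shell
# Store uploads without the executable bit:
# owner read/write, group read, nothing for others.
printf 'placeholder contents' > upload.bin   # stand-in for a received file
chmod 0640 upload.bin
[ ! -x upload.bin ] && echo "not executable"
```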

Having an anti-virus check the files is good, but be advised that detection rates are terrible. I've seen cases where a file was deeply infected and was pronounced clean by 30 out of 35 AV programs. Assume that all files are infected and restrict their permissions appropriately.

A real risk is that someone may upload code that contains an exploit. As long as the files are not executable it is harder. If there is no access to a compiler then it is harder. If you restrict the ways that the files can be accessed to only certain programs then it is even harder for the exploiter. The single best thing you can do though to protect against this is to install your patches. Updates keep you safe, even though they sometimes break stuff.

If the users are trustworthy, you may allow them to set the permissions themselves for who can read/write the files. That way each individual's sensitive information is controlled, and there is some accountability if someone sets the permissions too loosely.

This is a pretty big problem to solve, and I know I just touched the surface. I hope it at least helps a little though.

answered Sep 30 '22 by Freedom_Ben


Building such a website efficiently is really quite tricky, as you say. You might want to consider a GUI program built in Java or C# as a more suitable and more secure approach, for numerous reasons.

First, it is highly recommended that you split the files on the client side before upload. Your users can easily use powerful utilities made for this, like the FFSJ command-line interface (for Windows) or JJSplit (open-source Java).

The advantages of using such a program are as follows:

  1. Better, more secure user authentication
  2. Validation of the user's IP subnet on both the client side and the server side
  3. The option of splitting the files for ease of transfer
  4. The ability to upload multiple file parts simultaneously
  5. The ability to use the user's maximum Internet connection speed
  6. The ability to minimize, as well as correct, errors that occur during the upload
  7. Checksum verification after upload to ensure authenticity and a correct transfer
  8. Added security, since the upload itself can even be encrypted
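Points 3 and 7 above can be exercised end-to-end in a few lines of shell; the sizes here are scaled down for illustration:

```shell
# Split a file into parts, reassemble them, and verify the checksum (points 3 and 7).
# 1 MiB stands in for a 12 GB file; 256 KiB parts stand in for ~1 GiB chunks.
head -c 1048576 /dev/urandom > bigfile.dat
before=$(sha256sum bigfile.dat | awk '{print $1}')

split -b 262144 bigfile.dat part_        # client side: produces part_aa, part_ab, ...

cat part_* > reassembled.dat             # server side: glue the parts back in order
after=$(sha256sum reassembled.dat | awk '{print $1}')

[ "$before" = "$after" ] && echo "checksum ok"
```

Because the parts sort lexically (`part_aa`, `part_ab`, ...), a plain glob reassembles them in the right order; the checksum comparison then catches any part that was corrupted or dropped in transit.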

I could go on, but you get the gist. The obvious disadvantage of this option is the cost in money and time if you are not a programmer.

One such program that already has the basic ability to split and upload is GSplit. On Unix-based systems, users could also split the files themselves before upload, or use programs such as HJSplit.

PS:

If you have to stick with PHP throughout, check out Open Upload. It is similar to a RapidShare clone and already has IP restriction, user authentication, captchas, etc.

I hope that exhaustively answers your question. Good luck.

answered Sep 30 '22 by Chibueze Opata


It's impossible to upload a single file that large via PHP. It doesn't matter how many server resources (memory) you throw at it (and trust me, you'd need a lot!); there is one limitation in PHP that will prevent you from doing so.

In php.ini there is a setting called "post_max_size" which caps the amount of data you can send to the server in one request. This value is stored as an unsigned 32-bit integer, which means you can't go higher than 4,294,967,295 bytes (about 4 GiB).

A good solution would be to use Flash to split the file into several small parts, and then conduct the file upload. I found a library for you that does exactly this: http://www.blog.elimak.com/2011/08/uploading-large-files-with-flash-splitting-the-bytearray-in-chunks/

If you run into memory issues in PHP, just make the chunks smaller.

Good luck!

EDIT: To answer your second question, you've got most of it covered. Use SSL to conduct your transfer if you're afraid the data could be hijacked en route. For authentication, consider using OAuth, and make sure to revoke the user's access token after transfer inactivity. Add `php_flag engine off` to your .htaccess file to prevent uploaded PHP files from running. And a server virus scanner wouldn't be a bad thing to have running periodically either :)
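The .htaccess hardening mentioned here might look like the sketch below; the directives are standard Apache/mod_php ones, and the handler list should be adjusted to whatever your server actually maps:

```apache
# .htaccess in the upload directory (sketch): never execute what is stored here
php_flag engine off                  # mod_php: serve .php files as plain content
Options -ExecCGI                     # disable CGI execution in this directory
RemoveHandler .php .phtml .cgi .pl   # drop handler mappings for script extensions
```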

answered Sep 30 '22 by psychobunny