Paperclip, large file uploads, and AWS

So, I'm using Paperclip and aws-s3, which is awesome, and it works great. Just one problem, though: I need to upload really large files, as in over 50 megabytes, and nginx dies. Apparently Paperclip stores the upload to disk before sending it on to S3?

I found this really cool article, but it also seems to write to disk first and then do everything else in the background.

Ideally, I'd be able to upload the file in the background... I have a small amount of experience doing this with PHP, but nothing with Rails as of yet. Could anyone point me in a general direction, even?

asked Aug 11 '09 by Steve Klabnik

1 Answer

You can bypass the server entirely and upload directly to S3, which avoids the timeout. The same thing happens on Heroku. If you are using Rails 3, check out my sample projects (a minimal sketch of the signed policy these uploaders rely on follows the links):

Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader

Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
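The core of any direct-to-S3 uploader is a signed POST policy generated server-side. Here is a minimal sketch using only the Ruby standard library; the bucket name, key prefix, and size cap are placeholder assumptions, and the sample projects above wire the equivalent into their Flash uploaders for you:

    require 'base64'
    require 'json'
    require 'openssl'

    S3_BUCKET = 'my-bucket' # hypothetical bucket name

    # Build the policy document: what the browser may POST, and until when.
    def upload_policy
      policy = {
        'expiration' => (Time.now + 3600).utc.strftime('%Y-%m-%dT%H:%M:%SZ'),
        'conditions' => [
          { 'bucket' => S3_BUCKET },
          ['starts-with', '$key', 'uploads/'],
          { 'acl' => 'private' },
          ['content-length-range', 0, 500 * 1024 * 1024] # allow up to 500 MB
        ]
      }
      Base64.encode64(policy.to_json).gsub("\n", '')
    end

    # Sign the Base64-encoded policy with HMAC-SHA1, per S3's
    # browser-based POST spec.
    def upload_signature(encoded_policy)
      secret = ENV['AWS_SECRET_ACCESS_KEY']
      Base64.encode64(
        OpenSSL::HMAC.digest(OpenSSL::Digest::SHA1.new, secret, encoded_policy)
      ).gsub("\n", '')
    end

The uploader's form then POSTs the file straight to the bucket endpoint along with the key, AWSAccessKeyId, acl, policy, and signature fields, so the file bytes never pass through your Rails server or nginx at all.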

By the way, you can do post-processing with Paperclip using an approach like the one this blog post (that Nico wrote) describes:

http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
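In case it helps, the gist of that approach looks roughly like this: once the file is already on S3, pull it back down and assign it to the Paperclip attachment so Paperclip's processors (thumbnail styles, etc.) still run. The Photo model, image attachment, and s3_url param are hypothetical names, and this is a sketch of the idea rather than the post's exact code:

    require 'open-uri'
    require 'tempfile'

    class PhotosController < ApplicationController
      # Hypothetical callback hit once the direct-to-S3 upload succeeds.
      def finalize
        photo = Photo.find(params[:id])
        open(params[:s3_url]) do |remote|
          tmp = Tempfile.new('upload')
          tmp.binmode
          tmp.write(remote.read)
          tmp.rewind
          # Assigning the file runs Paperclip's processors as usual.
          photo.image = tmp
          photo.save!
          tmp.close!
        end
        head :ok
      end
    end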

answered by iwasrobbed