Best Ruby on Rails Architecture for Image Heavy App

I'm building an application that allows for large amounts of photo uploads at once, and wanted to know what the best setup would be to tackle this.

This is what I am using so far:

  • jQuery File Upload: lets users drag and drop images
  • CarrierWave: processes and resizes images with ImageMagick
  • Amazon S3: CarrierWave uploads the images to S3 through Fog
  • Heroku: for hosting

I'd like to allow users to be able to drag and drop a large amount of images onto a page, and then navigate to other pages while the upload is going on in the background. I'd also like pictures to appear as they finish uploading. I don't want this process to lock up the Heroku dynos, so I probably need to move the work to a background job but I'm not sure what to use for my situation.

What's the best setup for this type of app? What background worker gem should I use? Is Cloudinary a good idea?

Asked Apr 30 '13 by Jonathan Sutherland

1 Answer

I recently built an application that accepts a large number of uploads on Heroku. I decided to build my own solution instead of using Cloudinary or an equivalent service. Here are some lessons I learned:

  • Don't upload to Heroku. Your web worker is tied up for the entire duration of the upload, which can be up to a minute. Unacceptable.

  • Use a JavaScript uploader (like jquery-file-upload) to upload directly to S3. This is a little complicated to set up at first, but once it works it's fantastic. You can use the s3_direct_upload gem, or read its source and roll your own (see the initializer sketch after this list). That gem is based on a RailsCasts Pro episode, which you have to pay for, but whose source code is available.

  • When the upload finishes, make an AJAX request to your application, passing the new S3 URL as a remote URL (controller sketch below). CarrierWave then processes the image from S3 as if it had been uploaded directly, except it takes a couple of seconds instead of up to a minute.

  • Use jquery-file-upload's client-side image resizing. Somebody is going to upload a 5MB photo and then complain that the upload takes forever. Resizing in the browser makes every upload as fast as theoretically possible.

  • Configure S3 to clear your uploads folder automatically with a lifecycle rule (sketch below).

  • Don't use Thin. Use Unicorn. A couple of seconds is too long to spend processing a request on Thin, but Unicorn with three or four workers is much more forgiving (config sketch below).

  • Don't use RMagick. It has a nicer API for complex image manipulation, but it uses astonishing amounts of memory. Use mini_magick instead (uploader sketch below).
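
To make a few of these concrete, here are some rough sketches. First, the direct-to-S3 piece: s3_direct_upload is configured in an initializer and rendered in the view with its s3_uploader_form helper. The ENV variable names and bucket below are placeholders, not anything the gem requires.

    # config/initializers/s3_direct_upload.rb (sketch; credential names are placeholders)
    S3DirectUpload.config do |c|
      c.access_key_id     = ENV["AWS_ACCESS_KEY_ID"]
      c.secret_access_key = ENV["AWS_SECRET_ACCESS_KEY"]
      c.bucket            = ENV["AWS_S3_BUCKET"]
    end

In the view, s3_uploader_form takes a callback_url and callback_param, which is how the browser tells your app where the file landed once S3 accepts it.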
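When that callback fires, the controller only receives a URL; CarrierWave's remote_*_url= setter does the heavy lifting. A minimal sketch, assuming a Photo model with an uploader mounted as image (the model and param names are illustrative):

    # app/controllers/photos_controller.rb (sketch)
    class PhotosController < ApplicationController
      def create
        @photo = current_user.photos.new
        # CarrierWave downloads the file from S3 and builds the versions;
        # the big multipart upload never touches the dyno.
        @photo.remote_image_url = params[:photo][:image_url]

        if @photo.save
          render partial: "photo", locals: { photo: @photo }, status: :created
        else
          head :unprocessable_entity
        end
      end
    end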
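Clearing the uploads folder can be done once in the S3 console, or scripted. A hedged sketch using the aws-sdk-s3 gem, assuming raw uploads land under an uploads/ prefix (bucket, region, and prefix are assumptions):

    # One-off script or rake task (sketch)
    require "aws-sdk-s3"

    s3 = Aws::S3::Client.new(region: "us-east-1")
    s3.put_bucket_lifecycle_configuration(
      bucket: "my-upload-bucket",
      lifecycle_configuration: {
        rules: [{
          id:         "expire-raw-uploads",
          status:     "Enabled",
          filter:     { prefix: "uploads/" },
          expiration: { days: 1 }  # delete raw uploads a day after CarrierWave has copied them
        }]
      }
    )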
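The Unicorn setup is the usual Heroku-style config. Three workers is a reasonable starting point for a small dyno, but treat the number as an assumption to tune, not a rule:

    # config/unicorn.rb (sketch)
    worker_processes Integer(ENV["WEB_CONCURRENCY"] || 3)
    timeout 30
    preload_app true

    before_fork do |_server, _worker|
      # The master's database connection isn't shared with the workers.
      ActiveRecord::Base.connection.disconnect! if defined?(ActiveRecord::Base)
    end

    after_fork do |_server, _worker|
      ActiveRecord::Base.establish_connection if defined?(ActiveRecord::Base)
    end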
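And the uploader itself: switching from RMagick to MiniMagick is one include plus the mini_magick gem. The version names and sizes below are just examples:

    # app/uploaders/image_uploader.rb (sketch)
    class ImageUploader < CarrierWave::Uploader::Base
      include CarrierWave::MiniMagick   # shells out to ImageMagick instead of loading RMagick in-process
      storage :fog

      process resize_to_limit: [2048, 2048]   # cap the stored original

      version :thumb do
        process resize_to_fill: [200, 200]
      end
    end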

You'll note that I'm not using a background worker for any of this. If you're feeling really meticulous, the controller that receives the remote URL could hand its work to a background worker, and if you need the result immediately the worker could notify the UI over pubsub (Faye or Pusher, possibly with the exciting new sync gem); a hedged sketch follows. But this wasn't necessary for my application, and I'd rather spend my money on another web dyno than on a worker dyno.
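
If you do push it to a worker, the job itself is tiny. A sketch with Sidekiq and Pusher; the model, channel, and event names are assumptions, and Faye would slot in the same way:

    # app/workers/process_upload_worker.rb (sketch)
    class ProcessUploadWorker
      include Sidekiq::Worker

      def perform(user_id, image_url)
        user  = User.find(user_id)
        photo = user.photos.new
        photo.remote_image_url = image_url   # CarrierWave fetches from S3 and builds versions
        photo.save!

        # Let the browser know the thumbnail is ready.
        Pusher.trigger("user-#{user.id}", "photo:processed",
                       id: photo.id, thumb: photo.image.thumb.url)
      end
    end

The controller would then just call ProcessUploadWorker.perform_async(current_user.id, params[:photo][:image_url]) and return immediately.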

And, yeah, if you want to let users click around the whole application while uploads are in progress, you'll need to either keep the upload running in a popup (with some kind of pubsub solution) or build the whole site as a JavaScript application with Ember, Backbone, Angular, or the like.

Any questions?

Answered Oct 20 '22 by Taavo