
rails carrierwave private files on S3 and cloudfront

I have both public and private files which I serve from Amazon CloudFront. The public files work fine, but now I'd like to secure some of them as private with an authenticated read.

The private files have their own uploader, DocumentUploader. Do the files need to be stored in separate buckets? As it is now they are all in one bucket.

I've done something similar with Paperclip a while back, but I can't seem to find a good resource for doing it with CarrierWave using a timed authenticated_url.

I see they have something like it here:

http://www.rdoc.info/github/jnicklas/carrierwave/5d1cb7e6a4e8a4786c2b/CarrierWave/Storage/Fog/File#authenticated_url-instance_method

But I'm not sure how to implement it.

Any tips would be greatly appreciated.

asked Sep 10 '13 by ere

3 Answers

It depends how secure you need to be, but you can set file permissions on the particular uploader class itself, overriding the default permissions like so:

class SomeUploader < CarrierWave::Uploader::Base

  def fog_public
    false # future uploads are stored as private, i.e. not publicly accessible
  end

  def fog_authenticated_url_expiration
    5.minutes # in seconds from now (the default is 10.minutes)
  end

  # ...
end

That will automatically cause URLs for files from this uploader to be appended with temporary AWS expiration and access-key query parameters, and future uploads will be stored as private, i.e. not publicly accessible:

https://s3.amazonaws.com/uploads/something/1234/124.pdf?AWSAccessKeyId=AKIAJKOSTQ6UXXLEWIUQ&Signature=4yM%2FF%2F5TV6t4b1IIvjseenRrb%2FY%3D&Expires=1379152321
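For context, a URL of that shape comes from S3's legacy query-string authentication (Signature Version 2), which CarrierWave/fog generates for you. The sketch below only illustrates the mechanism behind it; the bucket, key, and credentials are made-up placeholders, not real values.

```ruby
require 'openssl'
require 'base64'
require 'cgi'

# Build an S3 query-string-authenticated URL (legacy Signature Version 2).
# The string to sign for a plain GET is:
#   "GET\n\n\n<expires>\n/<bucket>/<key>"
# and the signature is URL-encoded Base64(HMAC-SHA1(secret, string_to_sign)).
def s3_authenticated_url(bucket, key, access_key_id, secret_access_key, expires_at)
  string_to_sign = "GET\n\n\n#{expires_at}\n/#{bucket}/#{key}"
  hmac      = OpenSSL::HMAC.digest('SHA1', secret_access_key, string_to_sign)
  signature = CGI.escape(Base64.strict_encode64(hmac))
  "https://s3.amazonaws.com/#{bucket}/#{key}" \
    "?AWSAccessKeyId=#{access_key_id}&Signature=#{signature}&Expires=#{expires_at}"
end

url = s3_authenticated_url('uploads', 'something/1234/124.pdf',
                           'AKIAEXAMPLE', 'not-a-real-secret', 1379152321)
```

Once the `Expires` timestamp passes, S3 rejects the request, which is what makes the link "timed".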

answered Nov 03 '22 by holden


As far as I can see, you may need to create another bucket for secured files.

You can implement the security for your 'private' files yourself: in your model (if you have one) add a field that marks whether the file is secure, then manage that scenario in your controller.

One nice gem you can use is cancan. With it you can manage the model and its attributes (the secure field) and grant or deny authorization based on your user's profile.
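A minimal plain-Ruby sketch of that idea, with a hypothetical `Document` carrying a `secure` flag and the authorization check reduced to a single admin check (in a real app this check would live in your controller or a cancan ability):

```ruby
# A made-up stand-in for the model described above: a `secure` flag decides
# whether to hand out the public CDN URL or a short-lived authenticated URL.
Document = Struct.new(:secure, :public_url, :authenticated_url) do
  def url_for(user)
    if secure
      # Only authorized users get the signed, expiring URL.
      raise 'not authorized' unless user[:admin]
      authenticated_url
    else
      public_url
    end
  end
end

public_doc = Document.new(false, 'https://cdn.example.com/open.pdf', nil)
secret_doc = Document.new(true, nil, 'https://s3.amazonaws.com/b/k?Signature=...')

public_doc.url_for(admin: false) # anyone gets the public URL
secret_doc.url_for(admin: true)  # only admins get the signed URL
```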

answered Nov 03 '22 by Rodrigo Oliveira


You can also set up the CarrierWave config in a separate uploader, like this, using gem 'aws-sdk', '~> 2.10' and gem 'carrierwave-aws', '~> 1.1':

    class BusinessDocumentUploader < CarrierWave::Uploader::Base

      def initialize(*)
        super

        CarrierWave.configure do |config|
          config.storage    = :aws
          config.aws_bucket = Rails.application.secrets.aws_bucket

          # Accepts private, public-read, public-read-write, authenticated-read,
          # aws-exec-read, bucket-owner-read, bucket-owner-full-control
          config.aws_acl    = 'private'

          # Optionally define an asset host for configurations that are fronted
          # by a content host, such as CloudFront.
          config.asset_host = Rails.application.secrets.aws_asset_host

          # The maximum period for authenticated_urls is only 7 days.
          config.aws_authenticated_url_expiration = 60 * 60 * 24 * 7

          # Set custom options such as cache control to leverage browser caching.
          config.aws_attributes = {
            expires: 1.week.from_now.httpdate,
            cache_control: 'max-age=604800'
          }

          config.aws_credentials = {
            access_key_id:     Rails.application.secrets.aws_access_key_id,
            secret_access_key: Rails.application.secrets.aws_secret_access_key,
            region:            Rails.application.secrets.aws_region # Required
          }
        end
      end
    end
answered Nov 03 '22 by Marcelo Austria