403 AccessDenied for Paperclip attachments uploaded to Amazon S3 using s3cmd tool

I recently moved over a thousand Paperclip attachments from local storage to Amazon S3 using the s3cmd tool. You can find details on how I accomplished this here, but to summarize, I used the following command to migrate all of the old attachments:

s3cmd sync my-app/public/system/ s3://mybucket 

I updated my codebase to make use of the new S3 bucket, tested the connection, and everything works fine. In fact, I can upload new attachments to the remote S3 bucket through my application and view/download them with no problem. However, it seems that somewhere along the line Paperclip and S3 fell out of sync with one another: all of the attachments that I moved over to my S3 bucket (blurred out in the image below) return 403s if I try to access them through my application, yet new attachments uploaded to the same bucket load just fine.

[Image: permission denied for old attachments migrated to S3]

I have an IAM group set up with the following policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}

Judging from the fact that I can upload new attachments to S3, I'm going to say the connection is established just fine. I believe I set up s3cmd to connect to my bucket via a different IAM user account than the one my application uses to access the bucket. Could it perhaps be a permissions issue? If so, can I change the policy above in any way to grant access to those files?

I'm using the aws-sdk gem to integrate Paperclip with S3.
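For context, the Paperclip side of this is configured roughly as follows. This is an illustrative sketch rather than my exact code: the model, attachment name, and environment variable names are placeholders, but the storage and s3_credentials options are the standard Paperclip ones.

# app/models/photo.rb (illustrative model and attachment names)
class Photo < ActiveRecord::Base
  has_attached_file :image,
    storage: :s3,
    s3_credentials: {
      bucket: 'mybucket',
      access_key_id: ENV['AWS_ACCESS_KEY_ID'],
      secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
    }

  # Paperclip 4+ requires a content type validation on every attachment
  validates_attachment_content_type :image, content_type: /\Aimage\/.*\z/
end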

Edit: I thought it might have been an ownership issue, as I uploaded the files using admin keys rather than the ones I use in my application. I purged the bucket and re-ran the s3cmd sync after configuring it to use the same keys the application was using; I'm met with the same result, however.

Edit 2: I can verify my permissions and connection further by going into the production console for my application and interacting with my bucket manually. Everything works perfectly fine, i.e. I can retrieve files that my browser returns 403s for.

> s3 = AWS::S3.new
=> <AWS::S3>
> bucket = s3.buckets['mybucket']
=> #<AWS::S3::Bucket:mybucket>  
> bucket.exists?
=> true
> image = bucket.objects["SomeFolder/SomeImage.jpg"]
=> <AWS::S3::S3Object:SomeFolder/SomeImage.jpg>
> puts image.read
=> (raw binary image data)
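If this were an ACL or bucket policy problem, I'd expect the object to be readable through a presigned URL but not through its plain public URL, which is the kind of URL the application serves for public-read attachments. A quick check from the same console session (a sketch against the v1 aws-sdk API; public_url returns the unsigned URL and url_for(:read) returns a presigned one):

> image.public_url.to_s        # unsigned URL; opening this in a browser is what gives the 403
=> "https://mybucket.s3.amazonaws.com/SomeFolder/SomeImage.jpg"
> image.url_for(:read).to_s    # presigned URL; this one should load, matching the successful image.read above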
asked Jan 06 '14 by Noz

1 Answer

It looks like your S3 bucket policy is not properly allowing public read access for anonymous (public) users. Try something like:

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-brand-new-bucket/*"
      ]
    }
  ]
}

The fact that these files only become accessible to public users once you manually apply public-read permissions confirms that your bucket policy is not granting read access correctly.

When you use a public S3 URL to access the files, there is no authenticated user.
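If you would rather not open the whole bucket with a policy, another option is to fix the ACLs on just the objects that were migrated with s3cmd. Paperclip uploads with a public-read canned ACL by default (its :s3_permissions option), while s3cmd sync left the migrated objects private. Here is a rough one-off sketch using the v1 aws-sdk you already have in the console; the bucket name is taken from your question, and treat it as a sketch rather than a drop-in script:

# one-off console snippet / script, aws-sdk v1
require 'aws-sdk'

s3 = AWS::S3.new
s3.buckets['mybucket'].objects.each do |object|
  object.acl = :public_read   # same canned ACL new Paperclip uploads receive
end

s3cmd can do the same thing in place with s3cmd setacl s3://mybucket --acl-public --recursive, or apply it at migration time by passing --acl-public to the sync.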

answered Sep 27 '22 by Winfield