
Private Content CORS issues with S3

The Problem

I've been scratching my head for quite a while trying to navigate the many three-letter-acronym services in the AWS ecosystem.

What I'm looking to do is display content (PDFs, images, videos) on a website, such that the content can only be viewed on the website by authenticated users and is not downloadable or publicly accessible.

My issue is that no matter what I try, short of allowing complete public access to the bucket, I get a 403 error when requesting image content from the front end.

The architecture of the application is a front-end JS app hosted on S3 and served through a CloudFront distribution. This communicates with a backend hosted on EC2.
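
For reference, the front end loads the protected assets with plain img elements pointed at the CloudFront URL, roughly like this (the domain and path are placeholders):

// Plain image load from the CloudFront distribution; this is the
// request that currently comes back as a 403.
const img = document.createElement("img");
img.src = "https://assets.websitename.com/private/photo.jpg"; // placeholder URL
document.body.appendChild(img);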

What I've tried so far

CORS Configuration

The documentation on the topic seems to suggest that simply creating CORS rules allowing access to objects in the bucket from my website should be enough.

I created a CORS policy for the bucket, similar to the following, which should allow access to its objects:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>http://*.websitename.com</AllowedOrigin>
    <AllowedOrigin>https://*.websitename.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>

No luck here: 403 error. As a test to see whether it might be something other than the CORS configuration, I opened up the CORS policy to allow public CORS access to the bucket, like so:

<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>

No luck here either. Someone else on Stack Overflow mentioned that there is a little-documented issue with AllowedOrigin: *, and suggested trying the following:

<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>http://*</AllowedOrigin>
    <AllowedOrigin>https://*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>

One user on Stack Overflow points out that directly linked content doesn't send the Origin header by default, which means S3 interprets it as an ordinary (non-CORS) request and sends the 403 Forbidden error. The suggested fix is adding crossorigin="anonymous" to the image tag.

After adding this attribute to the img tag, I confirmed that the headers are sent correctly: the request includes Origin: mywebsitename.com, and I even receive the correct response header listing the allowed origins, including my website.
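
In code, that amounts to creating the image along these lines (the URL is again a placeholder):

// Setting crossOrigin makes the browser send an Origin header with the
// request, so S3 treats it as a CORS request.
const img = document.createElement("img");
img.crossOrigin = "anonymous"; // equivalent to crossorigin="anonymous" in HTML
img.src = "https://assets.websitename.com/private/photo.jpg"; // placeholder URL
img.onerror = () => console.error("image request failed");
document.body.appendChild(img);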

CloudFront Distribution Issues

As per this answer, CloudFront doesn't forward request headers to S3 by default. I've taken the step of forwarding all headers, including Access-Control-Allow-Methods and Access-Control-Allow-Origin.
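
A quick way to check whether the CORS headers actually come back through CloudFront after that change is a fetch from the browser console, roughly like this (the URL is a placeholder):

// Diagnostic: request the asset as an explicit CORS request and log the
// status plus the CORS response header CloudFront returns.
const url = "https://assets.websitename.com/private/photo.jpg"; // placeholder URL
fetch(url, { mode: "cors" }).then((res) => {
  console.log("status:", res.status); // still 403
  console.log(
    "access-control-allow-origin:",
    res.headers.get("access-control-allow-origin")
  );
});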

In addition, another person suggested that CloudFront occasionally has odd caching issues, so I've also added a random query string to the image URL, with the same result.
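
That is, something along these lines (the URL is a placeholder):

// Cache-busting attempt: append a random query string so CloudFront
// can't serve a stale cached response for the same object.
const baseUrl = "https://assets.websitename.com/private/photo.jpg"; // placeholder URL
const cacheBusted = `${baseUrl}?cb=${Date.now()}-${Math.random().toString(36).slice(2)}`;

const bustedImg = new Image();
bustedImg.crossOrigin = "anonymous";
bustedImg.src = cacheBusted; // still comes back as 403
document.body.appendChild(bustedImg);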

After both, still 403.

I'm kind of at the end of my rope with this whole setup and am considering other options, such as a proxy server on the backend that handles the authorization myself. While allowing all origins isn't the end goal for me, it should be working at this point. Does anyone have any ideas?
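
For completeness, the proxy idea would look roughly like this on the EC2 backend, using Express and the AWS SDK v3 (the bucket name, region, route, and isAuthenticated() check are all placeholders):

// Sketch of the fallback: a backend proxy that checks the user's session
// and streams the object from S3, so the bucket never needs to be public.
import express from "express";
import { Readable } from "node:stream";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const app = express();
const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

// Placeholder auth check; in reality this would validate the user's session or JWT.
function isAuthenticated(req: express.Request): boolean {
  return Boolean(req.headers.authorization);
}

app.get("/assets/*", async (req, res) => {
  if (!isAuthenticated(req)) {
    res.status(401).end();
    return;
  }
  const key = req.params[0]; // everything after /assets/
  try {
    const obj = await s3.send(
      new GetObjectCommand({ Bucket: "my-private-bucket", Key: key }) // placeholder bucket
    );
    if (obj.ContentType) res.setHeader("Content-Type", obj.ContentType);
    (obj.Body as Readable).pipe(res); // stream the object straight to the client
  } catch {
    res.status(404).end();
  }
});

app.listen(3000);

The front end would then request /assets/... from the backend instead of hitting the bucket or CloudFront directly, which sidesteps the CORS question entirely.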

Josh Alexy asked Dec 19 '18


1 Answer

content can only be displayed on the website to authenticated users and not be downloaded or publicly accessible

Your CloudFront or S3 is treating every request as unauthenticated. Configure your S3 bucket to allow all requests from CloudFront, and try a Lambda@Edge function that checks authentication and allows or denies each request. Another option is to use signed URLs for S3, but that involves more work.
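
Something along these lines for the Lambda@Edge part (the token check here is just a placeholder; a real implementation would verify a session or JWT):

// Viewer-request sketch: reject requests that don't carry an auth token
// and let everything else continue to the origin.
import type { CloudFrontRequestHandler } from "aws-lambda";

export const handler: CloudFrontRequestHandler = async (event) => {
  const request = event.Records[0].cf.request;
  const token = request.headers["authorization"]?.[0]?.value;

  if (!token || !looksValid(token)) {
    return {
      status: "403",
      statusDescription: "Forbidden",
      body: "Not authorized",
    };
  }
  return request; // pass the request through to CloudFront/S3
};

// Placeholder check; a real implementation would verify a JWT or session token.
function looksValid(token: string): boolean {
  return token.startsWith("Bearer ");
}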

Trinopoty answered Sep 19 '22