Where do I set cache information for my images?

This is about a Rails app on Heroku that runs behind CloudFront and serves ActiveStorage images from the Bucketeer add-on.

Cache configuration in both the Rails app itself and in CloudFront works as intended for CSS, JS, and even important dynamic requests (search results, third-party data fetched from APIs, and so on).

What I can't figure out is how to cache the images that come from the Bucketeer add-on.

Right now the images seem to come from the Bucketeer bucket every time. They show up with no Cache TTL.

I'd like for them to be cached for up to a year both at the CloudFront level and the visitor's browser level.
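For reference, a one-year lifetime corresponds to a response header like the following (31536000 is the number of seconds in 365 days):

Cache-Control: public, max-age=31536000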

Is this possible?

It seems like the Bucketeer add-on itself gives no control over how the bucket or the service handles caching.

Where can I force these files to show up with caching instructions?

asked Mar 04 '23 by vivipoit

2 Answers

Thanks for sharing your findings here.

Additionally, I found that S3Service accepts upload options: https://github.com/rails/rails/blob/6-0-stable/activestorage/lib/active_storage/service/s3_service.rb#L12

So you can add the following to your config/storage.yml:

s3:
  service: S3
  access_key_id: ID
  secret_access_key: KEY
  region: REGION
  bucket: BUCKET
  upload:
    cache_control: 'public, max-age=31536000'

For a full list of available options, refer to the AWS SDK documentation.
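For example (purely as an illustration, not part of my setup), other documented options can sit in the same upload: block:

  upload:
    cache_control: 'public, max-age=31536000'
    server_side_encryption: 'AES256' # or 'aws:kms'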

answered Mar 17 '23 by Alex Suslyakov


After a lot of searching, I learned that Bucketeer does give you control over the bucket. You just have to use the AWS CLI.

Here is the link to AWS docs on CLI: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html

And here is the link where Bucketeer tells you how to get started with that on their service: https://devcenter.heroku.com/articles/bucketeer#using-with-the-aws-cli

This means you can install the AWS CLI, run aws configure with the credentials Bucketeer provides, and then change Cache-Control on the objects in the bucket directly.
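A rough sketch of that setup, assuming the usual BUCKETEER_* config vars (check heroku config for the exact names in your app):

# values come from the Bucketeer add-on's config vars
aws configure set aws_access_key_id "$BUCKETEER_AWS_ACCESS_KEY_ID"
aws configure set aws_secret_access_key "$BUCKETEER_AWS_SECRET_ACCESS_KEY"
aws configure set default.region "$BUCKETEER_AWS_REGION"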

AWS does not seem to have a feature for setting Cache-Control defaults for an entire bucket or folder, so you have to set it on each object.

In my case, all of my files/objects in the bucket are images that I display on the website and need to cache, so it's safe to run a command that does it all at once.

Such a command can be found in this answer: How to set expires headers to all images in a bucket in Amazon S3

For me, it looked like this:

aws s3 cp s3://my-bucket-name s3://my-bucket-name --recursive --acl public-read --metadata-directive REPLACE --cache-control max-age=43200000

The command basically copies the entire bucket onto itself, adding the Cache-Control: max-age=43200000 header to each object in the process.
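You can spot-check a single object afterwards (the key here is a placeholder) to confirm the header stuck; the response should include a CacheControl field:

aws s3api head-object --bucket my-bucket-name --key some/image.jpg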

This works for all existing files, but it will not change anything for future additions or updates. You'd have to run it again every so often to catch new objects, and/or write code that sets the headers when each object is saved to the bucket (a sketch of that follows below). Apparently some people have had luck with this. Not me.
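If you do go the code route for new uploads, a minimal sketch with the aws-sdk-s3 gem looks something like this (bucket, key, and env var names are placeholders; if your files go through ActiveStorage, the storage.yml upload option from the other answer already does the same thing):

require "aws-sdk-s3"

# Credentials and region come from the Bucketeer config vars (assumed names).
s3 = Aws::S3::Resource.new(
  region: ENV["BUCKETEER_AWS_REGION"],
  access_key_id: ENV["BUCKETEER_AWS_ACCESS_KEY_ID"],
  secret_access_key: ENV["BUCKETEER_AWS_SECRET_ACCESS_KEY"]
)

# Setting Cache-Control at upload time means no follow-up copy command is needed.
object = s3.bucket(ENV["BUCKETEER_BUCKET_NAME"]).object("images/example.jpg")
object.put(
  body: File.read("example.jpg"),
  acl: "public-read",
  cache_control: "public, max-age=43200000"
)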

Thankfully, I found this post: https://www.neontsunami.com/posts/caching-variants-with-activestorage

This monkey-patch basically changes ActiveStorage::RepresentationsController#show to use Rails action caching for variants. Take a look. If you're having similar issues, it's worth the read.
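The post has the actual patch; very roughly, the shape of it is something like this (a simplified sketch, not a copy of the post's code, and it assumes the actionpack-action_caching gem, since action caching was extracted from Rails core):

# Gemfile
gem "actionpack-action_caching"

# config/initializers/active_storage_variant_caching.rb
Rails.application.config.to_prepare do
  ActiveStorage::RepresentationsController.class_eval do
    # action-cache the variant responses; see the post for caveats and cache key details
    caches_action :show, expires_in: 1.year
  end
end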

There are drawbacks. For my case, they were not a problem, so this is the solution I went with.

answered Mar 17 '23 by vivipoit