 

Set content-encoding to specific files via aws command

I deploy a static application with the aws command. I copy all files from my folder to an S3 bucket with this command:

aws s3 sync C:\app s3://myBucket

I want to set the content-encoding to gzip for just the js, jpg, and html files. I succeeded in doing it for the whole folder with the --content-encoding gzip option. How can I do it for just those specific file types?
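A minimal sketch of one approach, keeping the bucket name from the question: since sync applies --content-encoding to every file it matches in one invocation, run two syncs, one that excludes the three types and one that includes only them and sets the header. Note that --content-encoding only sets response header metadata; the files themselves must already be gzip-compressed for clients to decode them correctly.

aws s3 sync C:\app s3://myBucket --exclude "*.js" --exclude "*.jpg" --exclude "*.html"
aws s3 sync C:\app s3://myBucket --exclude "*" --include "*.js" --include "*.jpg" --include "*.html" --content-encoding gzip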

asked Feb 12 '23 by BrBr

1 Answer

This is old, but I needed to find a way to do this: I'm not using CloudFront, and the proxy we are using doesn't handle gzip, so:

  1. Exclude all files
  2. Include individual file types as needed
  3. Set the appropriate encoding/options

Below I'm also adding access control and cache-control, and deleting any files in S3 not present in the local directory.

I've separated the JS/CSS from all of the images and HTML, but that is probably not necessary.

I did, however, have a lot of trouble when I didn't explicitly set the content-encoding/cache options for each individual --include, so I've set them as below to make it clearer.

The AWS docs that I could find don't mention any of this.

# First sync: non-gzipped assets (icons, images, HTML), public-read, no-cache
aws s3 sync ./dist/ s3://{bucket} \
--exclude "*" \
--include "static/*.ico" --acl "public-read" --cache-control no-cache \
--include "static/*.png" --acl "public-read" --cache-control no-cache \
--include "*.html" --acl "public-read" --cache-control no-cache \
--include "static/img/*.svg" --acl "public-read" --cache-control no-cache \
--delete

# Second sync: gzipped JS/CSS with the content-encoding header set
aws s3 sync ./dist/ s3://{bucket} \
--exclude "*" \
--include "static/js/*.js" --acl "public-read" --cache-control no-cache --content-encoding gzip \
--include "static/css/*.css" --acl "public-read" --cache-control no-cache --content-encoding gzip \
--delete

Pretty neat little speed improvement for serving only from S3.
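To confirm the metadata landed, a quick check with aws s3api head-object (the bucket and key here are placeholders) should show the ContentEncoding and CacheControl values that were set:

aws s3api head-object --bucket {bucket} --key static/js/app.js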

answered Mar 02 '23 by comfytoday