 

GZIP Compression on static Amazon S3 files

I would like to implement GZIP compression on my site. I've implemented it on IIS, and the HTML page is compressed successfully as expected.

Now the issue is with the CSS and JS files, which I serve from Amazon S3. They are not compressed at all. I want to compress them too.

Please guide me on how to do this. Sharing links would help me a lot.

Update: I've added the metadata header "Content-Encoding: gzip" to the S3 files, and it now shows in the response header. Still, the file size is the same and the particular CSS has no effect on the page. I can't even open it in the browser. Here is the [link][1] to the particular CSS.

Thanks

RaJesh RiJo asked Aug 04 '15 05:08


People also ask

Does S3 use gzip?

Most popular web servers support serving content using GZIP, and most popular web browsers recognize the GZIP header and decompress files on the fly. Even though Amazon S3 has most of the features of a full-fledged web server, it lacks transparent GZIP support.

Does S3 support compression?

S3 does not support stream compression, nor is it possible to compress an uploaded file remotely.

Is Brotli better than gzip?

Brotli has a better compression ratio (i.e. it produces smaller compressed files) across every level of compression. While GZIP does beat Brotli on speed most of the time, the level you compress at factors into the results you'll see.


2 Answers

Files should be compressed before being uploaded to Amazon S3.

For some examples, see:

  • Serving Compressed (gzipped) Static Files from Amazon S3 or Cloudfront
  • How to: Gzip compression of CSS and JS files on S3 with s3cmd
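As a sketch of that approach (the bucket name, key, and file path below are placeholders, not from the linked articles), a file can be gzipped locally and uploaded with the Content-Encoding metadata set, here using the boto3 SDK:

```python
import gzip


def gzip_bytes(data: bytes) -> bytes:
    """Compress raw bytes with gzip at maximum compression."""
    return gzip.compress(data, compresslevel=9)


def upload_gzipped(bucket: str, key: str, path: str, content_type: str) -> None:
    """Gzip a local file and upload it to S3 with Content-Encoding: gzip,
    so browsers that sent Accept-Encoding: gzip decompress it on the fly."""
    import boto3  # third-party AWS SDK; imported here so the rest runs without it

    with open(path, "rb") as f:
        body = gzip_bytes(f.read())
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=body,
        ContentType=content_type,   # e.g. "text/css" -- must still be set,
        ContentEncoding="gzip",     # since the stored bytes are the gzip stream
    )


# Hypothetical usage:
# upload_gzipped("my-bucket", "css/site.css", "site.css", "text/css")
```

The same result can be had with the AWS CLI by gzipping the file first and passing `--content-encoding gzip` to `aws s3 cp`.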
John Rotenstein answered Oct 07 '22 20:10


If you use CloudFront in front of your S3 bucket, there is no need to manually compress HTML resources (CloudFront will compress them on the fly). Note that CloudFront only compresses with gzip (no deflate or Brotli) and only CSS/JS/HTML files (based on Content-Type). See https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html#compressed-content-cloudfront-file-types. To make it work, you have to forward some HTTP headers from CloudFront to S3 (see the doc).

If your S3 bucket has resources not supported by CloudFront (a generic "binary/octet-stream" MIME type, like an "hdr" texture or an "nds" ROM), you need to compress them yourself before uploading to S3, then set the "Content-Encoding" HTTP metadata on the resource. Note that only browsers supporting gzip encoding will be able to download and decompress the file.

If you don't want to compress the files one by one by hand, you can use a Lambda function that:

  • is triggered on each PUT of an object (a file) in the bucket
  • if the file is not already compressed and compression is useful, replaces the original uploaded file with the compressed version
  • sets the Content-Encoding HTTP header to gzip
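A minimal sketch of such a function might look like the following (names and the "already compressed" check are my assumptions, not taken from the GIST below; a real deployment must also avoid the PUT from the function re-triggering itself, e.g. by filtering the trigger on a prefix or suffix):

```python
import gzip
import urllib.parse

GZIP_MAGIC = b"\x1f\x8b"  # first two bytes of any gzip stream


def worth_compressing(original: bytes, compressed: bytes) -> bool:
    """Only replace the object if gzip actually saves space."""
    return len(compressed) < len(original)


def lambda_handler(event, context):
    """Triggered by an S3 PUT event: gzip the new object in place and set
    Content-Encoding so browsers decompress it transparently. Sketch only."""
    import boto3  # available in the Lambda runtime; imported here for local testing

    s3 = boto3.client("s3")
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()
        if body[:2] == GZIP_MAGIC:          # already gzipped: nothing to do
            continue
        gz = gzip.compress(body, compresslevel=9)
        if not worth_compressing(body, gz):  # e.g. JPEG/PNG: skip
            continue
        s3.put_object(
            Bucket=bucket,
            Key=key,
            Body=gz,
            ContentType=obj.get("ContentType", "binary/octet-stream"),
            ContentEncoding="gzip",
        )
```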

I wrote a GIST for this; it can inspire you to create your own process. See https://gist.github.com/psa-jforestier/1c74330df8e0d1fd6028e75e210e5042

And don't forget to invalidate (i.e. purge) CloudFront to apply your changes.

JayMore answered Oct 07 '22 21:10