
Things to watch out for with Content-Encoding: gzip

I've created a static website hosted in an S3 bucket. My asset files (CSS and JS) are minified and compressed with gzip. Each file is named either file_gz.js or file_gz.css and is delivered with a Content-Encoding: gzip header.

So far, I've tested the website in various browsers and it works fine. The assets are delivered in their compressed form and the page doesn't look any different.

The only issue I see is that, since this is an S3 bucket, there is no failsafe for when the client (the browser) doesn't support gzip encoding. The client would receive gzipped bytes it can't decode, and no styling or JavaScript enhancements would be applied to the page.

Does anyone know of any problems caused by setting Content-Encoding: gzip? Do all browsers support it properly? Are there any other headers I need to add to make this work correctly?

asked Aug 28 '12 by matsko



2 Answers

Modern browsers support encoded content pretty much across the board. However, it's not safe to assume that all user agents will. The problem with your implementation is that it completely ignores HTTP's built-in method for avoiding this very problem: content negotiation. You have a couple of options:

  1. You can continue to close your eyes to the problem and hope that every user agent that accesses your content will be able to decode your gzipped resources. Unfortunately, that will almost certainly not be the case; browsers are not the only user agents out there, and the "head-in-the-sand" approach to problem solving is rarely a good idea.

  2. Implement a solution to negotiate whether or not you serve a gzipped response using the Accept-Encoding header. If the client does not specify this header at all or specifies it but doesn't mention gzip, you can be fairly sure the user won't be able to decode a gzipped response. In those cases you need to send the uncompressed version.

The ins and outs of content negotiation are beyond the scope of this answer. You'll need to do some research on how to parse the Accept-Encoding header and negotiate the encoding of your responses. Usually, content encoding is accomplished through the use of third-party modules like Apache's mod_deflate. Though I'm not familiar with S3's options in this area, I suspect you'll need to implement the negotiation yourself.
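A minimal sketch of that negotiation, in Python, assuming you keep both a plain and a gzipped copy of each asset and that some layer you control (not the bare S3 bucket) decides which one to serve; the file names here are placeholders:

    # Decide which variant to serve based on the client's Accept-Encoding header.
    # Note: a full implementation should also honour q-values (e.g. "gzip;q=0"
    # means gzip is NOT acceptable); this sketch only checks for the token.
    def pick_variant(accept_encoding):
        """Return (file_to_serve, extra_headers)."""
        encodings = [
            token.split(";")[0].strip().lower()
            for token in (accept_encoding or "").split(",")
        ]
        if "gzip" in encodings:
            # Client advertised gzip support: serve the compressed copy.
            return "example.css.gz", {"Content-Encoding": "gzip",
                                      "Vary": "Accept-Encoding"}
        # Header missing, or gzip not listed: fall back to the uncompressed file.
        return "example.css", {"Vary": "Accept-Encoding"}

    print(pick_variant("gzip, deflate"))  # -> ('example.css.gz', {...})
    print(pick_variant(None))             # -> ('example.css', {...})

Sending Vary: Accept-Encoding with either variant tells caches that the response depends on that request header, so a cached gzipped copy isn't handed to a client that never asked for it.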

In summary: sending encoded content without first clearing it with the client is not a very good idea.

answered Oct 12 '22 by rdlowrey


  1. Take your CSS / minified CSS (e.g. example.css [247 kB]).
  2. Run gzip -9 example.css; the resulting file will be example.css.gz [44 kB].
  3. Rename example.css.gz back to example.css.
  4. Upload the file to the S3 bucket and open the metadata section in its properties.
  5. Add a new metadata entry: key Content-Encoding, value gzip.
  6. Now your CSS is served minified and gzipped (a scripted version of these steps is sketched below).
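If you'd rather script those console steps, here's a rough sketch using boto3 (the AWS SDK for Python); the bucket name and file names are placeholders, and it assumes your AWS credentials are already configured:

    # Compress a CSS file and upload it to S3 with Content-Encoding: gzip metadata.
    import gzip
    import boto3

    BUCKET = "my-static-site-bucket"  # hypothetical bucket name

    def upload_gzipped_css(local_path, key):
        with open(local_path, "rb") as f:
            body = gzip.compress(f.read(), compresslevel=9)  # same as `gzip -9`
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key=key,                   # keep the plain name, e.g. "example.css"
            Body=body,
            ContentType="text/css",
            ContentEncoding="gzip",    # the metadata set by hand in step 5
        )

    upload_gzipped_css("example.css", "example.css")

Keeping the object key's plain name (example.css) means your HTML doesn't need to change, which is exactly what the rename in step 3 achieves.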

source: http://www.rightbrainnetworks.com/blog/serving-compressed-gzipped-static-files-from-amazon-s3-or-cloudfront/

answered Oct 13 '22 by Cyril Prince