I've created a static website that is hosted on an S3 bucket. My asset files (CSS and JS) are minified and compressed with gzip. The files are named either file_gz.js or file_gz.css and are delivered with a Content-Encoding: gzip header.
So far, I've tested the website in various browsers and it works fine. The assets are delivered in their compressed form and the page doesn't look any different.
The only issue I see is that, since this is an S3 bucket, there is no failsafe for when the client (the browser) doesn't support gzip encoding. Instead, the HTTP request will fail and there will be no styling or JavaScript enhancements applied to the page.
Does anyone know of any problems with setting Content-Encoding: gzip? Do all browsers support this properly? Are there any other headers I need to add to make this work properly?
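To illustrate the setup, a quick header check against one of these assets looks roughly like this (a minimal sketch using Python's requests library; the URL is just a placeholder for my bucket's website endpoint):

    import requests

    url = "https://example-bucket.s3.amazonaws.com/assets/file_gz.css"  # placeholder URL

    resp = requests.get(url, headers={"Accept-Encoding": "gzip"})
    print(resp.status_code)                      # 200
    print(resp.headers.get("Content-Encoding"))  # "gzip"
    print(resp.headers.get("Content-Type"))      # e.g. "text/css"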
Modern browsers support encoded content pretty much across the board. However, it's not safe to assume that all user agents will. The problem with your implementation is that it completely ignores HTTP's built-in method for avoiding this very problem: content negotiation. You have a couple of options:
1. You can continue to close your eyes to the problem and hope that every user agent that accesses your content will be able to decode your gzip resources. Unfortunately, this will almost certainly not be the case; browsers are not the only user agents out there, and the "head-in-the-sand" approach to problem solving is rarely a good idea.
2. Implement a solution to negotiate whether or not you serve a gzipped response using the Accept-Encoding header. If the client does not specify this header at all, or specifies it but doesn't mention gzip, you can be fairly sure the user won't be able to decode a gzipped response. In those cases you need to send the uncompressed version.
The ins and outs of content negotiation are beyond the scope of this answer. You'll need to do some research on how to parse the Accept-Encoding header and negotiate the encoding of your responses. Usually, content encoding is accomplished through third-party modules like Apache's mod_deflate. Though I'm not familiar with S3's options in this area, I suspect you'll need to implement the negotiation yourself.
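For illustration only, here is a rough sketch of what that negotiation could look like, assuming you keep both a compressed and an uncompressed copy of each asset and have some layer in front of the bucket (a proxy, an edge function, etc.) where code like this could run. The key names and the simplified Accept-Encoding parsing are my own assumptions, not anything S3 provides out of the box:

    from typing import Optional

    def pick_object_key(accept_encoding: Optional[str], base_key: str) -> str:
        """Serve the pre-compressed copy only if the client advertises gzip support."""
        if not accept_encoding:
            # No Accept-Encoding header at all: play it safe, send the plain file.
            return base_key
        # Accept-Encoding is a comma-separated list, e.g. "gzip, deflate, br;q=0.5".
        # This simplified parser ignores q-values; a full implementation should
        # also honor "gzip;q=0", which explicitly refuses gzip.
        encodings = [token.split(";")[0].strip().lower()
                     for token in accept_encoding.split(",")]
        if "gzip" in encodings or "*" in encodings:
            return base_key + ".gz"  # pre-compressed copy stored alongside the original
        return base_key

    # Examples:
    print(pick_object_key("gzip, deflate, br", "assets/example.css"))  # assets/example.css.gz
    print(pick_object_key(None, "assets/example.css"))                 # assets/example.css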
In summary: sending encoded content without first clearing it with the client is not a very good idea.
Gzip the file, e.g. gzip -9 example.css (example.css [247 kb] becomes example.css.gz [44 kb]), then rename example.css.gz back to example.css. Upload the file to S3 and add a metadata header with key Content-Encoding and value gzip.
source: http://www.rightbrainnetworks.com/blog/serving-compressed-gzipped-static-files-from-amazon-s3-or-cloudfront/
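For what it's worth, the upload step can also be scripted; a minimal sketch with boto3, where the bucket name and key are placeholders and the file is assumed to already be gzip-compressed and renamed as described above:

    import boto3

    s3 = boto3.client("s3")

    s3.upload_file(
        Filename="example.css",          # already gzip-compressed, renamed back to .css
        Bucket="my-static-site-bucket",  # placeholder bucket name
        Key="assets/example.css",        # placeholder key
        ExtraArgs={
            "ContentEncoding": "gzip",   # the header the browser needs to decompress
            "ContentType": "text/css",   # keep the correct MIME type
        },
    )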