I know that gzipping files before sending them across the network saves bandwidth, and for static files that can be cached, the impact on server-side CPU usage is insignificant since each file only has to be compressed once.
But what about the client? They have to gunzip whatever files are sent, which will take CPU time. Additionally, I'm worried that the entire file must be received and gunzipped before any parsing can take place.
This strikes me as odd because I see two scenarios:

1) client has fast internet --> the bandwidth saving hardly matters, so gzip seems irrelevant
2) client has slow internet --> gzip saves bandwidth, but prevents partial parsing
Clearly the exact speed-up (or slow-down?) will depend on the circumstances of the files being transferred and of the client. However, I'm curious what the time cost is on the client side, and how I can measure it.
Gzip is a fast and easy way to improve page speed performance while still delivering a high-quality experience to your users.
Used mostly on code and text files, gzip can reduce the size of JavaScript, CSS, and HTML files by up to 90%.
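If you want to sanity-check that kind of ratio yourself, here is a minimal Node.js sketch using only the built-in `zlib` module; the sample string is artificially repetitive, so expect real pages to compress somewhat less.

```js
// Gzip a blob of HTML-like text and report the size reduction.
const zlib = require('zlib');

const html = '<div class="row"><span>item</span></div>\n'.repeat(5000);
const compressed = zlib.gzipSync(html);

const original = Buffer.byteLength(html);
console.log(`original:   ${original} bytes`);
console.log(`compressed: ${compressed.length} bytes`);
console.log(`reduction:  ${(100 * (1 - compressed.length / original)).toFixed(1)}%`);
```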
Note that gzip operates on extremely small buffer sizes (small in terms of CPU speed vs. HD seek speed these days). It reads a few KB from the input file, compresses it, and flushes it to the output file. Since each flush can require a hard drive seek, only a few such operations can be done per second.
As an aside on archive formats: tar + gzip compresses better than zip, since the compression of the next file can use history from the previous file (sometimes referred to as a "solid" archive). zip can only compress files individually.
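You can see the "solid" effect with a small sketch; this just concatenates two buffers to simulate what tar does, it is not the tar format itself.

```js
// Two similar "files": compressed separately (zip-style) vs. as one stream (tar+gzip-style).
const zlib = require('zlib');

const fileA = Buffer.from('function render(items) { return items.map(i => `<li>${i}</li>`).join(""); }');
const fileB = Buffer.from('function renderRows(rows) { return rows.map(r => `<tr>${r}</tr>`).join(""); }');

const separate = zlib.gzipSync(fileA).length + zlib.gzipSync(fileB).length;
const solid = zlib.gzipSync(Buffer.concat([fileA, fileB])).length;

// "solid" wins because fileB's compression can back-reference fileA's text.
console.log({ separate, solid });
```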
They have to gunzip whatever files are sent, which will take CPU time.
Perhaps, but the CPU time spent on decompression is extremely small compared to all the other things going on when loading a page (parsing, styling, rendering, scripting).
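For a rough feel on your own machine, the sketch below (Node.js, built-in `zlib`; the payload and timings are illustrative, not a proper benchmark) times decompressing a few MB of JSON against merely parsing it:

```js
// Compare gunzip time with JSON.parse time on the same payload.
const zlib = require('zlib');

const json = JSON.stringify(
  Array.from({ length: 100000 }, (_, i) => ({ id: i, name: `item-${i}` }))
);
const gz = zlib.gzipSync(json);

console.time('gunzip');
const restored = zlib.gunzipSync(gz);
console.timeEnd('gunzip');

console.time('JSON.parse');
JSON.parse(restored.toString());
console.timeEnd('JSON.parse');
```

On typical hardware the parse step alone is comparable to the decompression, and a browser still has to style, lay out, and render on top of that.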
I'm worried that the entire file must be received and gunzipped before any parsing can take place.
Don't worry: gzip is a stream of data, so the complete file is not required before decompression and parsing can begin.
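You can demonstrate this with Node's streaming gunzip (a sketch; the sizes are arbitrary): feed the decompressor only the first few KB of a gzip stream, and decompressed output still comes out.

```js
// Show that gzip output is available before the full file has arrived.
const zlib = require('zlib');

const html = Array.from({ length: 50000 }, (_, i) => `<li>item ${i}</li>`).join('\n');
const compressed = zlib.gzipSync(html);

const gunzip = zlib.createGunzip();
gunzip.on('data', chunk =>
  console.log(`decompressed ${chunk.length} bytes from a partial download`)
);

// Send only the first 4 KB of the compressed stream; the rest never arrives.
gunzip.write(compressed.subarray(0, 4096));
```

A browser does the same thing: it parses HTML out of each decompressed chunk as it arrives.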
Specifically I want to know how I can gauge how much time is lost because of gzipping.
Here is an interesting article where the author performs the type of test you're describing. The tools are available for download so that you can perform the same tests in your own environment.
The author concludes:
I guess there are very few cases where you shouldn't gzip your content. If your typical page is less than 100 bytes then gzipping it could hurt the client's and the server's performance. But no website (except maybe a few web services) serves pages with a typical size of 100 bytes or less. So there's no excuse for serving uncompressed HTML.
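That 100-byte caveat is easy to reproduce: gzip adds a fixed header and trailer (on the order of 20 bytes) plus deflate framing, so a tiny response can actually grow. A quick sketch (Node.js, built-in `zlib`; the sample payloads are made up):

```js
// Tiny payloads can get bigger under gzip; normal pages shrink dramatically.
const zlib = require('zlib');

const tiny = '{"ok":true}';                     // 11-byte web-service reply
const page = '<p>hello world</p>'.repeat(2000); // stand-in for a normal page

console.log('tiny:', Buffer.byteLength(tiny), '->', zlib.gzipSync(tiny).length, 'bytes');
console.log('page:', Buffer.byteLength(page), '->', zlib.gzipSync(page).length, 'bytes');
```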