What is the browser's overhead to decompress a gzip server response for an average-sized web page?
Less than 1 ms? 1-3 ms? More?
I'll assume that you mean 1.3 MB uncompressed. I get about 6 ms of decompression time on one core of a 2 GHz i7.
If I assume 3:1 compression, an extra 7 Mbit would need to be transferred if the response were not compressed. That takes more than 6 ms on a 1 Gbit/s link, and about 700 ms on a more typical 10 Mbit/s link.
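The arithmetic behind those numbers can be sketched as follows (the 1.3 MB page size and 3:1 compression ratio are the answer's assumed figures, not measurements):

```python
# Back-of-the-envelope cost of skipping gzip, using the answer's
# assumed numbers: a 1.3 MB page compressing at roughly 3:1.
page_bytes = 1.3e6
compressed_bytes = page_bytes / 3
extra_bits = (page_bytes - compressed_bytes) * 8  # ~7 Mbit extra on the wire

for name, bps in [("1 Gbit/s", 1e9), ("10 Mbit/s", 10e6)]:
    extra_ms = extra_bits / bps * 1000
    print(f"{name}: {extra_ms:.1f} ms extra transfer time without gzip")
```

This prints roughly 6.9 ms for the gigabit link and about 693 ms for the 10 Mbit/s link, which is why the extra transfer time dwarfs the ~6 ms decompression cost.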
gzip is a big win for HTTP transfers.
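If you want to reproduce a decompression timing like this yourself, here is a minimal sketch using Python's standard-library `gzip` module and a synthetic ~1.3 MB page (repetitive HTML, since real pages compress well). Absolute numbers will differ from the C zlib figures above because of Python call overhead:

```python
import gzip
import time

# Synthetic ~1.3 MB "page": repetitive markup, like real HTML.
page = (b"<div class='row'>hello world</div>\n" * 40000)[:1_300_000]
blob = gzip.compress(page, compresslevel=6)  # zlib's default level

t0 = time.perf_counter()
out = gzip.decompress(blob)
elapsed_ms = (time.perf_counter() - t0) * 1000

assert out == page  # round-trip sanity check
print(f"{len(page)} -> {len(blob)} bytes; inflate took {elapsed_ms:.2f} ms")
```

Run it a few times and take the minimum; a single wall-clock sample is noisy at the millisecond scale.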
Using the zlib implementation of gzip with default parameters, on an internet-facing server (2.66 GHz quad-core Xeon), the gzip compression times are: less than 0.5 ms up to 15 KB; 4.5 ms for 361 KB; and 13 ms for 1077 KB.
I still consider this easily worth it, as most of our traffic heads out over Wi-Fi or 3G links, so transfer time far outweighs the server-side delay.
The times were measured with code bracketing only the calls to the gzip routines, using nanosecond-precision timers (I changed the source to implement this). I was measuring this anyway, to determine whether caching gzipped output was worth the memory tradeoff, or whether gzip was fast enough on its own. In our case, I think we will gzip everything above about 200 bytes and aggressively cache the gzipped responses, especially for larger payloads.
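That "threshold plus cache" policy could be sketched like this; the 200-byte cutoff comes from the answer, while the function names and the use of `functools.lru_cache` as the cache are illustrative assumptions:

```python
import gzip
from functools import lru_cache

MIN_GZIP_SIZE = 200  # bytes; below this, gzip overhead can exceed the savings

@lru_cache(maxsize=1024)
def cached_gzip(body: bytes) -> bytes:
    """Compress once, then serve repeat hits from the cache."""
    return gzip.compress(body)

def maybe_compress(body: bytes) -> tuple[bytes, bool]:
    """Return (payload, is_gzipped) for an outgoing response body."""
    if len(body) < MIN_GZIP_SIZE:
        # Tiny payloads: the ~20-byte gzip header/trailer plus deflate
        # framing can make the output *larger* than the input.
        return body, False
    return cached_gzip(body), True
```

The cutoff matters because a gzip stream carries fixed framing overhead, so very small responses can actually grow when compressed.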
(@Mark Adler, thanks for writing zlib!)