 

gzip without server support?

I have written a CSS server that does minification and basic parsing/variable replacement. The server is using node.js.

I want to gzip the responses from this server. As I was told in IRC, node.js does not currently have a gzip library, so I am attempting to do it manually from the command line (I only gzip when the result is not already in the cache).

I am pushing the file data out to a temp file and then using exec to call 'gzip -c -9 -q ' + tempFile. I get the compressed data back correctly (it seems), and send the proper Content-Encoding header as 'gzip', but Chrome reports:

Error 330 (net::ERR_CONTENT_DECODING_FAILED): Unknown error.

Also, some independent gzip testers online fail as well (not just Chrome).

I'm assuming I'm missing something simple about generating gzip output for browsers, since I have never tried to do it manually before.

Any assistance would be helpful. The server is blazing fast, but I need to gzip the content to get the best performance for end users.

Thanks.

UPDATE: I have verified that my Content-Length is correct.
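For reference, the flow described above looks roughly like this (a sketch only; the function, `tempFile`, `css`, and `res` names are illustrative, not the actual code):

```js
var fs = require('fs');
var exec = require('child_process').exec;

// Sketch of the approach in the question: write the CSS to a temp file,
// shell out to gzip, and send the result with a gzip Content-Encoding.
function sendCompressed(css, tempFile, res) {
  fs.writeFile(tempFile, css, function (err) {
    if (err) throw err;
    exec('gzip -c -9 -q ' + tempFile, function (err, stdout) {
      if (err) throw err;
      // stdout comes back as a JavaScript string, which is where the
      // encoding trouble discussed in the answers below comes from.
      res.writeHead(200, {
        'Content-Type': 'text/css',
        'Content-Encoding': 'gzip'
      });
      res.end(stdout);
    });
  });
}
```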

Spot asked Feb 22 '10

People also ask

When should you not use gzip?

If you take a file that is 1300 bytes and compress it to 800 bytes, it's still transmitted in that same 1500-byte packet regardless, so you've gained nothing. That being the case, you should restrict gzip compression to files larger than a single packet; 1400 bytes (1.4 KB) is a safe value.
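In Node terms, that advice amounts to a size check before compressing (a minimal sketch; the threshold constant and helper name are illustrative):

```js
// Only gzip payloads larger than roughly one network packet.
var MIN_GZIP_BYTES = 1400;

function shouldGzip(body) {
  return Buffer.byteLength(body) > MIN_GZIP_BYTES;
}
```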

Do any browsers not support gzip?

All modern browsers can handle a gzip encoded response. In fact, if you look at their requests, they'll have a header that says something along the lines of Accept-Encoding: gzip which is their way of saying to the server that they can handle gzipped responses.
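So a server should only send a gzipped body when the request advertises support for it, e.g. (a sketch; the helper name is illustrative):

```js
// Check whether the client advertised gzip support before compressing.
function clientAcceptsGzip(req) {
  var accept = req.headers['accept-encoding'] || '';
  return accept.indexOf('gzip') !== -1;
}
```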

How do I enable gzip compression on my server?

Gzip on Windows servers (IIS Manager): open IIS Manager, click on the site you want to enable compression for, click Compression (under IIS), then enable static compression and you are done.

Is gzip supported by all browsers?

Gzip compression: this HTTP header is supported in effectively all browsers.


2 Answers

Node is still bleeding edge and does not yet seem to handle binary data well.

Node's string encodings are ascii, binary and utf8. [...] "binary" only look[s] at the first 8 bits of the 16bit JavaScript string characters. The problem is that, per ECMA, strings are sequences of 16-bit characters. If you use UTF-8 (the default), some normalization happens when the data is read into the string, and this corrupts the gzipped bytes. If you use ascii, it obviously won't work.

It will work if you use the binary encoding for both reading and writing, because the upper 8 bits of a JavaScript string character are simply left unused. If that does not work, try to send the files directly to the client without loading them into JavaScript strings at all, perhaps with the help of a proxy server in front of Node.
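A minimal sketch of that binary round-trip, assuming the gzipped output has already been written to a temp file (the function and variable names are illustrative):

```js
var fs = require('fs');

// gzippedFile is the temp file that gzip already wrote; res is the HTTP response.
function sendGzippedFile(gzippedFile, res) {
  // Read with 'binary' so only the low 8 bits of each string character are used,
  // then write back out with the same encoding to avoid UTF-8 normalization.
  fs.readFile(gzippedFile, 'binary', function (err, data) {
    if (err) throw err;
    res.writeHead(200, {
      'Content-Type': 'text/css',
      'Content-Encoding': 'gzip',
      'Content-Length': data.length // one byte per character with 'binary'
    });
    res.write(data, 'binary');
    res.end();
  });
}
```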

I hope that Google's V8 engine implements a true binary string datatype, something like this proposal: http://groups.google.com/group/nodejs/browse_thread/thread/648a0f5ed2c95211/ef89acfe538931a1?lnk=gst&q=binary+type#ef89acfe538931a1

CommonJS is also proposing Binary/B, and since Node tries to follow CommonJS, there is some hope for the future.

Edit: I just discovered the net2 branch of Node, which contains a binary buffer (see src/node_buffer.h). It appears to be part of a complete overhaul of the networking code.
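With a real binary buffer type like that, the string round-trip can be avoided entirely. In later Node versions the same idea looks roughly like this (a sketch, not the net2 API itself; names are illustrative):

```js
var fs = require('fs');

// Reading without an encoding yields a Buffer of raw bytes, which can be
// written to the response without any string normalization in between.
function sendGzippedBuffer(gzippedFile, res) {
  fs.readFile(gzippedFile, function (err, buf) {
    if (err) throw err;
    res.writeHead(200, {
      'Content-Type': 'text/css',
      'Content-Encoding': 'gzip',
      'Content-Length': buf.length
    });
    res.end(buf);
  });
}
```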

nalply answered Sep 28 '22


Have you updated the Content-Length header to match the gzipped size? A mismatch there could easily break the browser's decoding.
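In other words, the header has to be recomputed from the compressed data itself (a sketch; `gzipped` stands for whatever the gzip command returned):

```js
// Length of the compressed payload, not the original CSS
// (with the 'binary' encoding, one character maps to one byte).
var contentLength = Buffer.byteLength(gzipped, 'binary');
```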

Sionide21 answered Sep 28 '22