I need to compress a buffer in Node.js with zlib at the extreme compression level. The output header should be 78 DA. Unless I'm missing something, the Node.js documentation doesn't really describe how to use the zlib.Deflate class. It doesn't accept any parameters.
http://nodejs.org/api/zlib.html#zlib_class_zlib_deflate
The node:zlib module can be used to implement support for the gzip, deflate and br content-encoding mechanisms defined by HTTP. The HTTP Accept-Encoding header is used within an HTTP request to identify the compression encodings accepted by the client.
Zlib is a built-in Node.js module that provides compression and decompression functionality for strings and buffers.
zlib is in wide use for data transmission and storage. For example, most HTTP transactions by servers and browsers compress and decompress data using zlib; specifically, the HTTP header Content-Encoding: deflate means the deflate compression method wrapped inside the zlib data format.
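That zlib wrapper is where the 78 DA header comes from: 0x78 is the CMF byte (deflate, 32 KB window) and 0xDA is the FLG byte written at the highest compression levels. A minimal in-memory sketch using the one-shot zlib.deflateSync convenience method (available in current Node.js releases), where inputBuffer stands in for whatever Buffer you want to compress:
var zlib = require('zlib');

// `inputBuffer` is a placeholder for the Buffer you want to compress
var deflated = zlib.deflateSync(inputBuffer, { level: 9 }); // maximum compression
console.log(deflated.slice(0, 2)); // <Buffer 78 da>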
Use zlib.createGzip / zlib.createDeflate to get an instance of the compressor you need, with options passed in an object.
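Note that createGzip produces the gzip container, whose header is 1f 8b; for the zlib-wrapped deflate stream that starts with 78 DA, use createDeflate with the level option set to 9, e.g.:
var zlib = require('zlib');
var deflate = zlib.createDeflate({ level: 9 }); // zlib-wrapped deflate; output starts with 78 da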
If you want to do this all in-memory:
var zlib = require('zlib');

// create a new gzip stream at maximum compression
var gzip = zlib.createGzip({
    level: 9 // maximum compression
}), buffers = [], nread = 0;

// attach event handlers...
gzip.on('error', function(err) {
    gzip.removeAllListeners();
    gzip = null;
});
gzip.on('data', function(chunk) {
    buffers.push(chunk);
    nread += chunk.length;
});
gzip.on('end', function() {
    var buffer;
    switch (buffers.length) {
        case 0: // no data. return an empty buffer
            buffer = Buffer.alloc(0);
            break;
        case 1: // only one chunk of data. return it.
            buffer = buffers[0];
            break;
        default: // concatenate the chunks of data into a single buffer.
            buffer = Buffer.alloc(nread);
            var n = 0;
            buffers.forEach(function(b) {
                b.copy(buffer, n, 0, b.length);
                n += b.length;
            });
            break;
    }
    gzip.removeAllListeners();
    gzip = null;
    // do something with `buffer` here!
});

// and finally, give it data to compress
// (`inputBuffer` is the Buffer you want to compress)
gzip.write(inputBuffer);
gzip.end();
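On current Node.js versions the manual concatenation in the 'end' handler above can be replaced with Buffer.concat, which takes the list of chunks and an optional total length; a sketch of the equivalent handler:
gzip.on('end', function() {
    var buffer = Buffer.concat(buffers, nread); // same result as the switch statement above
    // do something with `buffer` here!
});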
Of course, if you're dealing with large amounts of data, stream the output to a file rather than buffering everything in memory.
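If you do stream, here is a sketch of that variant, assuming placeholder file names input.txt and output.deflate and a Node.js version with stream.pipeline (10.0 or later):
var fs = require('fs');
var zlib = require('zlib');
var pipeline = require('stream').pipeline;

pipeline(
    fs.createReadStream('input.txt'),        // placeholder source file
    zlib.createDeflate({ level: 9 }),        // or zlib.createGzip({ level: 9 }) for the gzip container
    fs.createWriteStream('output.deflate'),  // placeholder destination file
    function(err) {
        if (err) console.error('compression failed:', err);
    }
);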