I need to achieve the setup below with a Node.js script: generate the zip on the fly, without ever touching disk, and stream it back to the client as a download. Can someone guide me or post a working script? From googling it seems this can be done with zipstream, but I couldn't find any working example. The flow I'm after (with a rough, untested sketch of my own after the list):
Grabs the files matching *.xml from the root folder.
Immediately writes to the client's HTTP response the headers saying this is a download and the file name ends in .zip.
zipstream writes the header bytes of the zip container.
Creates an HTTP request to the first image in S3.
Pipes that into zipstream (we don't actually need to run deflate, since the images are already compressed).
Pipes that into the client's HTTP response.
Repeats for each image, with zipstream correctly writing envelopes for each file.
zipstream writes the footer bytes for the zip container.
Ends the HTTP response.
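Something like the sketch below is what I imagine, though it's untested and I'm only guessing at the zip-stream package's API (entry()/finish() and the store option); the URLs and file names are just placeholders:

var http = require('http');
var https = require('https');
var ZipStream = require('zip-stream'); // guessing at this package

// placeholder image URLs -- in reality these would be the S3 object URLs
var imageUrls = [
  'https://my-bucket.s3.amazonaws.com/image1.jpg',
  'https://my-bucket.s3.amazonaws.com/image2.jpg'
];

http.createServer(function (req, res) {
  // say it's a download and the file name ends in .zip
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename="images.zip"'
  });

  // the zip container is written around whatever we feed in; store = no deflate
  var archive = new ZipStream({ store: true });
  archive.on('error', function (err) { res.destroy(err); });
  archive.pipe(res); // zip bytes go straight to the client

  // add one remote file at a time (entries have to be added sequentially)
  (function addNext(i) {
    if (i >= imageUrls.length) {
      archive.finish(); // writes the zip footer; piping ends the response
      return;
    }
    https.get(imageUrls[i], function (imageRes) {
      archive.entry(imageRes, { name: 'image' + i + '.jpg' }, function (err) {
        if (err) { return res.destroy(err); }
        addNext(i + 1);
      });
    });
  })(0);
}).listen(8080);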
Thanks,
Srinivas
I had the same requirement: stream files from Amazon S3, zip them on the fly (in memory) and deliver them to the browser through Node.js. My solution involved the knox and archiver packages, piping the archive's bytes into the response stream.
Since this happens on the fly, you won't know the resulting archive size, and therefore you cannot use the "Content-Length" HTTP header. Instead you'll have to use the "Transfer-Encoding: chunked" header.
The downside to "chunked" is that you won't get a progress bar for the download. I've tried setting the Content-Length header to an approximate value, but this only works in Chrome and Firefox; IE corrupts the file, and I haven't tested Safari.
var http = require("http");
var knox = require("knox");
var archiver = require("archiver");

// knox S3 client -- the key/secret/bucket values are placeholders
var client = knox.createClient({
    key: "<your-aws-key>",
    secret: "<your-aws-secret>",
    bucket: "<your-bucket>"
});

http.createServer(function (req, res) {
    var zippedFilename = "test.zip";
    // store: true -- don't compress the archive, the files are already compressed
    var archive = archiver("zip", { store: true });

    var header = {
        "Content-Type": "application/x-zip",
        "Pragma": "public",
        "Expires": "0",
        "Cache-Control": "private, must-revalidate, post-check=0, pre-check=0",
        "Content-Disposition": 'attachment; filename="' + zippedFilename + '"',
        "Transfer-Encoding": "chunked",
        "Content-Transfer-Encoding": "binary"
    };
    res.writeHead(200, header);

    // zip bytes are streamed to the client as they are produced
    archive.pipe(res);

    archive.on("error", function (err) {
        throw err;
    });

    archive.on("finish", function () {
        res.end();
    });

    // list the S3 objects under the prefix, then stream each one into the archive
    client.list({ prefix: "myfiles" }, function (err, data) {
        if (err || !data.Contents) {
            return res.end();
        }

        var fileCounter = 0;

        data.Contents.forEach(function (element) {
            var fileName = element.Key;
            fileCounter++;

            client.get(element.Key).on("response", function (awsData) {
                archive.append(awsData, { name: fileName });

                awsData.on("end", function () {
                    fileCounter--;
                    if (fileCounter < 1) {
                        // every S3 response has ended -- write the zip footer
                        archive.finalize();
                    }
                });
            }).end();
        });
    }).end();
}).listen(80, "127.0.0.1");
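If you're on the current AWS SDK rather than knox, the same idea looks roughly like the sketch below. It's untested here and assumes @aws-sdk/client-s3 (v3), where the GetObject Body is a readable stream in Node.js; the bucket, prefix and port are placeholders.

const http = require("http");
const archiver = require("archiver");
const { S3Client, ListObjectsV2Command, GetObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });
const BUCKET = "my-bucket"; // placeholder

http.createServer((req, res) => {
    res.writeHead(200, {
        "Content-Type": "application/zip",
        "Content-Disposition": 'attachment; filename="test.zip"'
    });

    const archive = archiver("zip", { store: true }); // no deflate
    archive.on("error", (err) => res.destroy(err));
    archive.pipe(res); // zip bytes stream straight to the client

    (async () => {
        const listing = await s3.send(new ListObjectsV2Command({ Bucket: BUCKET, Prefix: "myfiles" }));
        for (const obj of listing.Contents || []) {
            // Body is a readable stream in Node.js, so it can be appended directly
            const { Body } = await s3.send(new GetObjectCommand({ Bucket: BUCKET, Key: obj.Key }));
            archive.append(Body, { name: obj.Key });
        }
        await archive.finalize(); // writes the zip footer; piping ends the response
    })().catch((err) => res.destroy(err));
}).listen(8080);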