
Upload zip archive files to S3 with node

I download a zip archive from an API that contains gzipped files, and I need to take the .gz files and save them to S3. I don't want to decompress them or anything — just move them to S3.

When I open the archive, it has a folder with a random numeric name containing many files: /12345/file1.gz, /12345/file2.gz, etc.

I've tried yauzl and adm-zip, but I don't understand how to take each entry in the archive and send it to S3. I have the s3-stream-upload package, which I can use for the upload; I just can't get it right. Thanks for any help.

const yauzl = require("yauzl");
// `upload` is a writable stream from the s3-stream-upload package

yauzl.open("output.zip", {lazyEntries: true}, function(err, zipfile) {
  if (err) console.error('zip err: ', err);
  console.log(zipfile);
  //upload.write(zipfile);
  zipfile.readEntry();
  zipfile.on("entry", function(entry) {

      // file entry
      zipfile.openReadStream(entry, function(err, readStream) {
        if (err) console.error('readstream err: ', err);
        readStream.on("end", function() {
          zipfile.readEntry();
        });
        console.log(entry);
        readStream.pipe(upload) // upload is an s3-stream-upload writable
        .on('finish', function() { console.log('finished'); })
        .on('error', function(err) { console.error('stream err: ', err); });
      });

  });
});

This gives me a "write after end" error — I think because each read stream is the actual data of an entry, and the upload stream gets ended after the first one finishes. I've been at this a while and could use some help. Thanks.

Asked Apr 19 '18 by Ron

1 Answer

The answer was doing a straight S3 put, with each entry's readStream as the body of the object:

const yauzl = require("yauzl");
// `s3` is an instantiated AWS.S3 client; `startDate` is defined elsewhere

yauzl.open("output.zip", {lazyEntries: true}, function(err, zipfile) {
  if (err) return console.error('zip err: ', err);
  zipfile.readEntry();
  zipfile.on("entry", function(entry) {

      // file entry: stream it straight into its own S3 object
      zipfile.openReadStream(entry, function(err, readStream) {
        if (err) return console.error('readstream err: ', err);
        readStream.on("end", function() {
          zipfile.readEntry(); // move on to the next entry
        });
        // putObject needs a content length when given a stream body,
        // so set it from the entry's uncompressed size
        readStream.length = entry.uncompressedSize;

        s3.putObject({
            Bucket: "bark-data-team",
            Key: "amplitude-data/raw/" + startDate + "/" + entry.fileName,
            Body: readStream
        }, function(err, data) {
            if (err) return console.error(err);
            console.log(data);
        });
      });

  });
});
Answered Nov 13 '22 by Ron