 

Determining part size and queue size parameters for AWS S3 upload


I'm working with the Node.js AWS SDK for S3. I have a zip file output stream that I'd like to upload to an S3 bucket. It seems simple enough from the docs, but I noticed there are optional part size and queue size parameters. What exactly are these? Should I use them, and if so, how do I determine appropriate values? Much appreciated.

asked Aug 06 '16 by Daniel Kobe


People also ask

At what size does AWS recommend using multipart upload when uploading objects to S3?

After all parts of your object are uploaded, Amazon S3 assembles these parts and creates the object. In general, when your object size reaches 100 MB, you should consider using multipart uploads instead of uploading the object in a single operation.

What is the size limit when using the single put upload method in S3?

Upload a single object using the Amazon S3 console: with the Amazon S3 console, you can upload a single object up to 160 GB in size.

What is the minimum file size that you can upload into S3?

An object in S3 can range from a minimum of 0 bytes to a maximum of 5 terabytes. If you want to upload an object larger than 5 GB, you must either use multipart upload or split the file into logical chunks of up to 5 GB each and upload them as regular uploads.


1 Answer

This is a late response. Multiple parts can be queued and sent in parallel: partSize is the size of each of those parts, and queueSize is how many parts are uploaded concurrently. The maximum memory usage is partSize * queueSize, so I think the values you are looking for depend on the memory available on your machine.

answered Sep 22 '22 by azibi