I'm working with the Node.js AWS SDK for S3. I have a zip file output stream that I'd like to upload to an S3 bucket. It seems simple enough from reading the docs, but I noticed there are optional part size and queue size parameters, and I was wondering what exactly these are. Should I use them? If so, how do I determine appropriate values? Much appreciated.
After all parts of your object are uploaded, Amazon S3 assembles these parts and creates the object. In general, when your object size reaches 100 MB, you should consider using multipart uploads instead of uploading the object in a single operation.
Upload a single object using the Amazon S3 Console—With the Amazon S3 Console, you can upload a single object up to 160 GB in size.
The size of an object in S3 can range from a minimum of 0 bytes to a maximum of 5 terabytes, and a single PUT can upload at most 5 GB. So if you want to upload an object larger than 5 GB, you must either use multipart upload or split the file into logical chunks of up to 5 GB and upload each one as a regular upload.
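The size thresholds above can be sketched as a small decision helper. This is just an illustration of the documented limits (the ~100 MB multipart recommendation, the 5 GB single-PUT ceiling, and the 5 TB object maximum); the function name and return values are hypothetical, and the TB constant is approximated in binary units:

```javascript
// Hypothetical helper: pick an upload strategy from the S3 size limits
// quoted above. Thresholds are from the S3 docs; names are made up.
const MiB = 1024 ** 2;
const GiB = 1024 ** 3;
const TiB = 1024 ** 4; // approximation of the "5 TB" documented maximum

function chooseUploadStrategy(objectSizeBytes) {
  if (objectSizeBytes > 5 * TiB) {
    // Beyond the maximum S3 object size: no upload path works.
    throw new Error('Object exceeds the 5 TB S3 object size limit');
  }
  if (objectSizeBytes > 5 * GiB) {
    // A single PUT tops out at 5 GB, so multipart is mandatory here.
    return 'multipart-required';
  }
  if (objectSizeBytes >= 100 * MiB) {
    // The docs suggest considering multipart from roughly 100 MB up.
    return 'multipart-recommended';
  }
  return 'single-put';
}
```

For example, a 200 MB zip would land in the "multipart-recommended" bucket, while a 6 GB one has no choice but multipart.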
This is a late response.
Multiple parts can be queued and sent in parallel; the size of each part is the partSize parameter. The queueSize parameter is how many parts can be buffered and sent concurrently. The maximum memory usage is roughly partSize * queueSize, so the values you are looking for depend on the memory available on your machine.