
S3cmd sync returns "killed"

I am trying to sync a few large buckets on Amazon S3.

When I run my s3cmd sync --recursive command I get a response saying "killed".

Does anyone know what this may refer to? Is there a limit on the number of files that can be synced in S3?

Thanks for your help.

asked Feb 27 '13 by Spyros Lambrinidis

People also ask

How do I upload a file greater than 100 megabytes on Amazon S3?

Instead of using the Amazon S3 console, try uploading the file using the AWS Command Line Interface (AWS CLI) or an AWS SDK. Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. To upload a file that is larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.
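For example, the AWS CLI handles large objects transparently, switching to multipart upload behind the scenes when the file is big enough. A minimal sketch, assuming a hypothetical local file and bucket name:

    # Hypothetical file and bucket names; the CLI splits large files into parts automatically
    aws s3 cp ./large-backup.tar s3://example-bucket/large-backup.tar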

Is multipart upload faster?

Amazon S3 provides a faster, easier, and more flexible method for uploading large files, known as multipart upload. This feature allows you to break a larger object into smaller chunks and upload those chunks in parallel. If any chunk fails to upload, you can retry just that chunk.
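For illustration, the low-level flow can also be driven by hand with the aws s3api commands. This is only a rough sketch: the bucket and key names are hypothetical, and the UploadId and the parts.json list of uploaded parts are placeholders you would fill in from the actual responses.

    # Start the multipart upload; the response contains an UploadId
    aws s3api create-multipart-upload --bucket example-bucket --key big.iso
    # Upload each chunk, tagging it with a part number and the UploadId
    aws s3api upload-part --bucket example-bucket --key big.iso \
        --part-number 1 --body big.iso.part1 --upload-id "<UploadId>"
    # After all parts are uploaded, stitch them together using a JSON list of the parts' ETags
    aws s3api complete-multipart-upload --bucket example-bucket --key big.iso \
        --upload-id "<UploadId>" --multipart-upload file://parts.json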

At what file size should you use multipart upload?

An object in S3 can range from a minimum of 0 bytes to a maximum of 5 terabytes. If you want to upload an object larger than 5 GB, you need to either use multipart upload or split the file into logical chunks of up to 5 GB and upload them manually as regular uploads, as sketched below.
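As a rough sketch of the manual approach, assuming a hypothetical archive and bucket name, you could cut the file into 5 GB pieces with split and upload each piece as an ordinary object:

    # Split into 5 GB pieces (creates backup.tar.part-aa, backup.tar.part-ab, ...)
    split -b 5G backup.tar backup.tar.part-
    # Upload each piece as a regular, single-object upload
    for part in backup.tar.part-*; do
        aws s3 cp "$part" "s3://example-bucket/$part"
    done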

Which is the maximum S3 object size for upload in a single PUT operation?

Using the AWS SDKs, REST API, or AWS CLI, a single PUT operation can upload one object of up to 5 GB in size. Using the Amazon S3 console, you can upload a single object of up to 160 GB in size.
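For reference, a single-PUT upload corresponds to the put-object call in the AWS CLI. A minimal sketch with hypothetical names; the request only works for bodies up to 5 GB:

    # One object, one PUT request
    aws s3api put-object --bucket example-bucket --key data/file.bin --body ./file.bin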


1 Answer

After reading around, it looks like the program has memory consumption issues. In particular, high memory use can cause the OOM killer (out-of-memory killer) to terminate the process in order to keep the system from getting bogged down. A quick look at dmesg after the process is killed will generally show whether this is what happened.
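For example, something like the following will usually surface an OOM-killer entry; the exact log wording varies by kernel version, so treat the grep patterns as a starting point rather than a guarantee:

    # Search the kernel ring buffer for OOM-killer activity
    dmesg | grep -i -E 'out of memory|killed process'
    # On systemd-based systems the kernel log is also available through journalctl
    sudo journalctl -k | grep -i 'out of memory'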

With that in mind, I would make sure you're on the latest release, whose release notes mention that memory consumption issues were addressed.
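As a sketch, the commands below check the installed version and then upgrade it; which upgrade command applies depends on how s3cmd was installed on your system:

    # Show the currently installed s3cmd version
    s3cmd --version
    # Upgrade if it was installed from PyPI
    sudo pip install --upgrade s3cmd
    # Upgrade if it was installed from the distribution's packages (Debian/Ubuntu shown)
    sudo apt-get install --only-upgrade s3cmd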

answered Sep 24 '22 by cwgem