 

Amazon Web Services S3 Request Limit

I'm using AWS to run some data processing. I have 400 spot instances in EC2 with 4 processes each, all of them writing to a single bucket in S3. I've started to get an (apparently uncommon) error saying:

503: Slow Down

Does anyone know what the actual request limit is for an S3 bucket? I cannot find any AWS documentation on it.

Thank you!

asked Jun 25 '13 by Saul


People also ask

How many requests can S3 handle?

Amazon S3 now provides increased performance to support at least 3,500 requests per second to add data and 5,500 requests per second to retrieve data, which can save significant processing time for no additional charge.

Does S3 have a limit?

The total volume of data and number of objects you can store are unlimited. Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 TB. The largest object that can be uploaded in a single PUT is 5 GB.
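For anything over that 5 GB single-PUT cap you'd use multipart upload. A minimal sketch with boto3 (bucket and file names are placeholders); boto3's upload_file switches to multipart automatically once the file exceeds a configurable threshold:

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Force multipart for anything over 100 MB, in 100 MB parts.
config = TransferConfig(multipart_threshold=100 * 1024 * 1024,
                        multipart_chunksize=100 * 1024 * 1024)

# upload_file transparently performs a multipart upload when the
# file exceeds the threshold, sidestepping the 5 GB single-PUT cap.
s3.upload_file("big-dataset.tar", "my-bucket", "backups/big-dataset.tar",
               Config=config)
```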

How many GET and PUT requests are freely available in S3?

When you first start using Amazon S3 as a new customer, you can take advantage of a free usage tier. This gives you 5GB of S3 storage in the Standard Storage class, 2,000 PUT requests, 20,000 GET requests, and 15 GB of data transfer out of your storage “bucket” each month free for one year.

What is the size limit of S3 bucket policies?

Bucket policies are limited to 20 KB in size. You can use the AWS Policy Generator to create a bucket policy for your Amazon S3 bucket. You can then use the generated document to set your bucket policy by using the Amazon S3 console, through several third-party tools, or via your application.
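For scale, a typical policy is far below the 20 KB cap. A minimal sketch applied with boto3 (the bucket name and statement are placeholders, not a recommended policy):

```python
import json
import boto3

s3 = boto3.client("s3")

# A small read-only policy; real-world policies can grow toward the
# 20 KB limit, but most stay far below it.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowPublicRead",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::my-bucket/*",
    }],
}

s3.put_bucket_policy(Bucket="my-bucket", Policy=json.dumps(policy))
```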


3 Answers

AWS documents 503 as the result of a temporary error; it does not reflect a specific limit.

According to "Best Practices for Using Amazon S3" section on handling errors (http://aws.amazon.com/articles/1904/):

500-series errors indicate that a request didn't succeed, but may be retried. Though infrequent, these errors are to be expected as part of normal interaction with the service and should be explicitly handled with an exponential backoff algorithm (ideally one that utilizes jitter). One such algorithm can be found at http://en.wikipedia.org/wiki/Truncated_binary_exponential_backoff.

Particularly if you suddenly begin executing hundreds of PUTs per second into a single bucket, you may find that some requests return a 503 "Slow Down" error while the service works to repartition the load. As with all 500 series errors, these should be handled with exponential backoff.

While less detailed, the S3 Error responses documentation does include 503 Slow Down (http://docs.aws.amazon.com/AmazonS3/latest/API/ErrorResponses.html).
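In practice that means wrapping your PUTs in a retry loop. A minimal sketch in Python with boto3 (the function name and retry parameters are my own, and note that boto3 also performs some retries internally):

```python
import random
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def put_with_backoff(bucket, key, body, max_retries=8):
    """PUT an object, retrying 503 Slow Down with truncated
    exponential backoff plus full jitter."""
    for attempt in range(max_retries):
        try:
            return s3.put_object(Bucket=bucket, Key=key, Body=body)
        except ClientError as err:
            if err.response["Error"]["Code"] not in ("SlowDown",
                                                     "ServiceUnavailable"):
                raise
            # Sleep a random amount up to 2^attempt seconds (capped),
            # so 1,600 writers don't all retry in lockstep.
            time.sleep(random.uniform(0, min(2 ** attempt, 60)))
    raise RuntimeError("exhausted retries for s3://%s/%s" % (bucket, key))
```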

answered by James


From what I've read, Slow Down is a very infrequent error. However, after posting this question I received an email from AWS saying they had capped my LIST requests to 10 requests per second because I had too many going to a specific bucket.

I had been using a custom queuing script for the project I am working on, which relied on LIST requests to determine the next item to process. After running into this problem I switched to AWS SQS, which was a lot simpler to implement than I'd thought it would be. No more custom queue, no more massive amount of LIST requests.
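For anyone considering the same switch, the worker side of the SQS pattern looks roughly like this (the queue URL and the process() step are placeholders):

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/work-queue"

# Long-poll for the next item of work instead of LISTing the bucket.
resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                           MaxNumberOfMessages=1,
                           WaitTimeSeconds=20)

for msg in resp.get("Messages", []):
    process(msg["Body"])  # hypothetical processing step
    # Delete only after successful processing, so failed items
    # reappear on the queue and get retried.
    sqs.delete_message(QueueUrl=QUEUE_URL,
                       ReceiptHandle=msg["ReceiptHandle"])
```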

Thanks for the answers!

answered by Saul


To add to what James said: beyond the exponential backoff that is required anyway, some S3 partitioning internals have been discussed publicly and can be used to mitigate this in the future.

See here: http://aws.typepad.com/aws/2012/03/amazon-s3-performance-tips-tricks-seattle-hiring-event.html

Briefly, don't store everything with the same key prefix, or there is a higher likelihood you will hit these errors. Find some way to make the very first characters of the prefix as random as possible, to avoid hotspots in S3's internal partitioning.
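The linked post suggests a hash-based prefix. A minimal sketch in Python (the 4-character prefix length is just an illustration):

```python
import hashlib

def partitioned_key(key):
    """Prefix the key with the first 4 hex chars of its MD5 hash, so
    keys no longer share a single lexicographic prefix."""
    prefix = hashlib.md5(key.encode("utf-8")).hexdigest()[:4]
    return "%s/%s" % (prefix, key)

# "logs/2013/06/25/host-17.gz" -> e.g. "3f2a/logs/2013/06/25/host-17.gz"
```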

answered by prestomation