 

"ConnectionPoolTimeoutException" when iterating objects in S3

I've been working with the AWS Java API for some time without many problems. I'm currently using version 1.5.2 of the library.

When I iterate over the objects inside a folder with the following code:

import com.amazonaws.auth.PropertiesCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.*;

AmazonS3 s3 = new AmazonS3Client(new PropertiesCredentials(
        MyClass.class.getResourceAsStream("AwsCredentials.properties")));

String s3Key = "folder1/folder2";
String bucketName = Constantes.S3_BUCKET;
String key = s3Key + "/input_chopped/";

// First page of the listing
ObjectListing current = s3.listObjects(new ListObjectsRequest()
        .withBucketName(bucketName)
        .withPrefix(key));

int contador = 0;          // running count of objects seen
boolean siguiente = true;  // more pages left to fetch?

while (siguiente) {

    siguiente &= current.isTruncated();
    contador += current.getObjectSummaries().size();

    for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
        S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()));
        System.out.println(object.getKey());
    }

    current = s3.listNextBatchOfObjects(current);

}

Gist: https://gist.github.com/fgblanch/6038699

I'm getting the following exception:

INFO  (AmazonHttpClient.java:358) - Unable to execute HTTP request: Timeout waiting for connection from pool
org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
    at org.apache.http.impl.conn.PoolingClientConnectionManager.leaseConnection(PoolingClientConnectionManager.java:232)
    at org.apache.http.impl.conn.PoolingClientConnectionManager$1.getConnection(PoolingClientConnectionManager.java:199)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:456)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:784)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:315)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:199)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:2994)
    at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:918)
    at com.madiva.segmentacion.tests.ListaS3.main(ListaS3.java:177)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caught an AmazonClientException, which means the client encountered a serious internal problem while trying to communicate with S3, such as not being able to access the network.
Error Message: Unable to execute HTTP request: Timeout waiting for connection from pool

Any idea how to avoid this error? It only happens in folders with a large number of objects; in this case there were 463 files inside. Thanks.

asked Jul 22 '13 by Fgblanch



1 Answer

I've found that S3Object opens a connection for each object, and those connections are not released even when the object is garbage collected. You need to call object.close() to return the connection to the pool.

So the corrected code would be:

for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
    S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()));
    System.out.println(object.getKey());
    object.close(); // releases the underlying HTTP connection back to the pool
}
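For extra safety you can close in a finally block, so the connection is returned to the pool even if processing the object throws. This is just a sketch: it reuses the names from the question, wraps the loop in a hypothetical helper called printKeys, and assumes S3Object.close() declares IOException as it does in later 1.x SDKs:

import java.io.IOException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.*;

static void printKeys(AmazonS3 s3, String bucketName, ObjectListing current) throws IOException {
    for (S3ObjectSummary objectSummary : current.getObjectSummaries()) {
        S3Object object = s3.getObject(new GetObjectRequest(bucketName, objectSummary.getKey()));
        try {
            System.out.println(object.getKey());
        } finally {
            object.close(); // always release the pooled HTTP connection
        }
    }
}

Note also that in this particular loop the key is already available from objectSummary.getKey(), so the getObject call (and the connection it consumes) could be skipped entirely.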
answered Sep 18 '22 by Fgblanch