
Unable to connect to AWS S3 bucket using boto

import boto
import boto.s3.connection
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = '<access key>'
AWS_SECRET_ACCESS_KEY = '<my secret key>'
Bucketname = 'Bucket-name'

conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,
        host='s3.ap-southeast-1.amazonaws.com',
        is_secure=True,               # set to False if you are not using SSL
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
        )
bucket = conn.get_bucket(Bucketname)

Error:

  Traceback (most recent call last):
   File "uploads3.py", line 69, in <module>
    upload_hello_file_s3()
  File "uploads3.py", line 25, in upload_hello_file_s3
    bucket = conn.get_bucket(Bucketname)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 431, in get_bucket
    bucket.get_all_keys(headers, maxkeys=0)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/bucket.py", line 364, in get_all_keys
    '', headers, **params)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/bucket.py", line 321, in _get_all
    query_args=s)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 543, in make_request
    override_num_retries=override_num_retries)
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 937, in make_request
    return self._mexe(http_request, sender, override_num_retries)
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 899, in _mexe
    raise e
socket.gaierror: [Errno -2] Name or service not known

Please help me solve this problem; the bucket name, access key, and secret key are all correct.

MONTYHS asked Mar 17 '14


2 Answers

The question is answered, but I wanted to include some additional info that helped me. Keep in mind the latest boto is boto3, but I was stuck with Python 2.7 in a legacy environment.

Authentication

There are at least 3 ways to authenticate with boto: First, you can include credentials (access key, secret key) in the connect_to_region() call. A second way is to define the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY and then don't supply credentials in the connect_to_region() call. Finally, if using boto 2.5.1 or later, boto can use the IAM role for an instance to create temporary credentials.

For the first two, use the AWS console to create a user with access to the bucket. For the third, create an IAM role with access to the bucket and assign it to the instance. The third way is often the best, because then you don't have to store credentials in source control or manage credentials in the environment.
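The environment-variable approach above boils down to a simple lookup order: explicit arguments win, otherwise the standard AWS variables are read from the environment. A minimal sketch without boto at all (the helper name `resolve_credentials` is hypothetical, for illustration only):

```python
import os

def resolve_credentials(access_key=None, secret_key=None):
    """Hypothetical helper: prefer explicit arguments, fall back to the
    standard AWS environment variables, and fail loudly otherwise."""
    access_key = access_key or os.environ.get('AWS_ACCESS_KEY_ID')
    secret_key = secret_key or os.environ.get('AWS_SECRET_ACCESS_KEY')
    if not (access_key and secret_key):
        raise RuntimeError('no AWS credentials found')
    return access_key, secret_key

# Explicit arguments win over the environment:
os.environ['AWS_ACCESS_KEY_ID'] = 'env-key'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'env-secret'
print(resolve_credentials('arg-key', 'arg-secret'))  # ('arg-key', 'arg-secret')
print(resolve_credentials())                         # ('env-key', 'env-secret')
```

boto itself performs a similar resolution internally when you omit the credentials from the call, which is why the second method works.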

Accessing the Bucket

Now on to the mistake I made that caused the same message as the OP. The top level objects in S3 are buckets and everything below are keys. In my case the object I wanted to access was at s3:top-level/next-level/object. I tried to access it like this:

bucket = conn.get_bucket('top-level/next-level')

The point is that next-level is not a bucket but part of a key, and you'll get the "Name or service not known" message when the bucket you ask for doesn't exist.
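The fix is to split the path yourself: only the first component is the bucket, and everything after the first slash belongs to the key. A minimal sketch (the helper name is an assumption, not a boto API):

```python
def split_s3_path(path):
    """Split 'top-level/next-level/object' into (bucket, key).

    Only the first path component is the bucket; the remainder,
    slashes and all, is the key within that bucket.
    """
    bucket, _, key = path.partition('/')
    return bucket, key

print(split_s3_path('top-level/next-level/object'))
# ('top-level', 'next-level/object')
```

With boto 2 you would then call conn.get_bucket('top-level') and look up the key 'next-level/object' inside that bucket, rather than passing the whole path to get_bucket().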

Brad Dre answered Sep 23 '22

from boto3.session import Session

ACCESS_KEY = 'your_access_key'
SECRET_KEY = 'your_secret_key'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
my_bucket = s3.Bucket('bucket_name')

# List every object key in the bucket
for s3_file in my_bucket.objects.all():
    print(s3_file.key)
Abin Thomas Philip answered Sep 22 '22