get_bucket() gives 'Bad Request' for S3 buckets I didn't create via Boto

I'm using Boto to access a bucket in Amazon S3, but get_bucket() returns a 400 Bad Request error for some buckets. I'm starting to wonder if this is a bug in Boto, since I can retrieve the same bucket with get_all_buckets().

>>> from boto.s3.connection import S3Connection
>>> conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY)
>>> buckets = conn.get_all_buckets()
>>> buckets
[<Bucket: mysite-backups>]
>>> buckets[0]
<Bucket: mysite-backups>
>>> conn.get_bucket('mysite-backups')
Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "/path/to/virtualenv/lib/python2.7/site-packages/boto/s3/connection.py", line 502, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/path/to/virtualenv/lib/python2.7/site-packages/boto/s3/connection.py", line 549, in head_bucket
    response.status, response.reason, body)
S3ResponseError: S3ResponseError: 400 Bad Request

>>> conn.create_bucket('mysite_mybucket')
<Bucket: mysite_mybucket>
>>> conn.get_bucket('mysite_mybucket')
<Bucket: mysite_mybucket>

The same issue occurs even if I log in to the AWS console with the user account the access credentials belong to and create the bucket there.

Any idea why this might be happening?

asked Jan 29 '15 by seddonym

4 Answers

Add the S3 bucket's region-specific host to the Boto connection:

conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, host=AWS_HOST)

answered Nov 03 '22 by Naveen Agarwal


It turns out the issue is caused by the bucket's region (I was using Frankfurt). There are two ways of dealing with it:

  1. Give up on Frankfurt (@andpei points out there are currently reported issues with it) and recreate the bucket in a different region.

  2. Specify the region using the 'host' parameter when connecting (thanks @Siddarth):

    >>> REGION_HOST = 's3.eu-central-1.amazonaws.com'
    >>> conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY, host=REGION_HOST)
    >>> conn.get_bucket('mysite-backups')
    <Bucket: mysite-backups>
    

    You can find the relevant region host in the AWS documentation's list of S3 region endpoints.

answered Nov 03 '22 by seddonym


Use connect_to_region when dealing with buckets in different regions.
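A sketch of that approach, assuming the bucket lives in eu-central-1 (Frankfurt); the credential variables are placeholders for your own keys:

```python
import boto.s3

# connect_to_region builds a connection pointed at the region's own
# endpoint, so region-restricted buckets (e.g. in eu-central-1) resolve
# correctly instead of returning 400 Bad Request.
conn = boto.s3.connect_to_region(
    'eu-central-1',
    aws_access_key_id=S3_ACCESS_KEY,        # placeholder
    aws_secret_access_key=S3_SECRET_KEY,    # placeholder
)
bucket = conn.get_bucket('mysite-backups')
```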

answered Nov 03 '22 by Siddarth


A general and simple solution, which does not involve changing region or setting a specific host, can be found at https://github.com/boto/boto/issues/2916. After some editing:

The Frankfurt AWS region (Ireland and CN too, apparently) only support the V4 signature algorithm. (…)

Per the boto documentation, you can either add [s3] use-sigv4 = True to your ~/.boto file, or set the S3_USE_SIGV4 environment variable: os.environ['S3_USE_SIGV4'] = 'True'.
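A minimal sketch of the environment-variable route; the setting must be in place before boto issues any S3 requests, so setting it at the top of the script (before connecting) is the safe order:

```python
import os

# Force boto to sign S3 requests with the V4 signature algorithm,
# which SigV4-only regions such as eu-central-1 (Frankfurt) require.
os.environ['S3_USE_SIGV4'] = 'True'

# With the flag set, connect as usual (boto call commented out here,
# since it needs real credentials and network access):
# from boto.s3.connection import S3Connection
# conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY,
#                     host='s3.eu-central-1.amazonaws.com')
```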

answered Nov 03 '22 by Eric O Lebigot