
Why are no Amazon S3 authentication handlers ready?

People also ask

How does S3 authentication work?

The Amazon S3 REST API uses a custom HTTP scheme based on a keyed-HMAC (Hash Message Authentication Code) for authentication. To authenticate a request, you first concatenate selected elements of the request to form a string. You then use your AWS secret access key to calculate the HMAC of that string.
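
For illustration, here is a minimal sketch of that calculation for the legacy Signature Version 2 scheme; the key and request values below are placeholders (taken from AWS's own documentation example), not real credentials:

import base64
import hmac
from hashlib import sha1

# Placeholders -- not real credentials.
access_key_id = "AKIAIOSFODNN7EXAMPLE"
secret_key = b"wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

# Selected elements of the request, concatenated in a fixed order.
string_to_sign = (
    "GET\n"                               # HTTP verb
    "\n"                                  # Content-MD5 (empty)
    "\n"                                  # Content-Type (empty)
    "Tue, 27 Mar 2007 19:36:42 +0000\n"   # Date header
    "/johnsmith/photos/puppy.jpg"         # canonicalized resource
)

# HMAC-SHA1 of that string, keyed with the secret key, then base64-encoded.
signature = base64.b64encode(
    hmac.new(secret_key, string_to_sign.encode("utf-8"), sha1).digest()
).decode("ascii")

print("Authorization: AWS %s:%s" % (access_key_id, signature))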

Can we access S3 bucket without access key?

You can access an S3 bucket privately without authentication when you access the bucket from an Amazon Virtual Private Cloud (Amazon VPC). However, make sure that the VPC endpoint used points to Amazon S3.
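
For illustration, the kind of bucket policy that typically grants that in-VPC access is sketched below; the bucket name and VPC endpoint ID are placeholders, and an administrator with credentials would attach it to the bucket:

import json

# Hypothetical bucket and VPC endpoint ID -- substitute your own.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
        # Only requests arriving through this VPC endpoint are allowed.
        "Condition": {"StringEquals": {"aws:SourceVpce": "vpce-1a2b3c4d"}}
    }]
}
print(json.dumps(policy, indent=2))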

How do I authenticate AWS?

To authenticate from the console as a root user, you must sign in with your email address and password. As an IAM user, provide your account ID or alias, and then your user name and password. To authenticate from the API or AWS CLI, you must provide your access key and secret key.


Boto will take your credentials from the environment variables. I've tested this with V2.0b3 and it works fine. It will give precedence to credentials specified explicitly in the constructor, but it will pick up credentials from the environment variables too.
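
For example (a minimal sketch with placeholder keys):

import boto

# Picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment.
conn_from_env = boto.connect_s3()

# Credentials passed explicitly take precedence over the environment.
conn_explicit = boto.connect_s3(
    aws_access_key_id="AKIA...",        # placeholder
    aws_secret_access_key="...",        # placeholder
)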

The simplest way to do this is to put your credentials into a text file, and specify the location of that file in the environment.

For example, on Windows (I expect it will work just the same on Linux, but I have not personally tried that):

Create a file called "mycred.txt" and put it in C:\temp. This file contains two lines:

AWSAccessKeyId=<your access id>
AWSSecretKey=<your secret key>

Define the environment variable AWS_CREDENTIAL_FILE to point at C:\temp\mycred.txt

C:\>SET AWS_CREDENTIAL_FILE=C:\temp\mycred.txt
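
On Linux the equivalent would be (untested here, per the caveat above; adjust the path to wherever you put mycred.txt):

$ export AWS_CREDENTIAL_FILE=/path/to/mycred.txt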

Now your code fragment above:

import boto
conn = boto.connect_s3()

will work fine.


I'm a newbie to both Python and boto, but I was able to reproduce your error (or at least the last line of it).

You are most likely failing to export your variables in bash. If you just define them, they are only valid in the current shell; export them and Python inherits the value. Thus:

$ AWS_ACCESS_KEY_ID="SDFGRVWGFVVDWSFGWERGBSDER"

will not work unless you also add:

$ export AWS_ACCESS_KEY_ID

Or you can do it all on the same line:

$ export AWS_ACCESS_KEY_ID="SDFGRVWGFVVDWSFGWERGBSDER"

Likewise for the other value. You can also put this in your .bashrc (assuming bash is your shell, and assuming you remember to export).
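
A quick sanity check that Python actually sees the exported value (nothing boto-specific here):

$ python -c "import os; print(os.environ.get('AWS_ACCESS_KEY_ID'))"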


Following up on nealmcb's answer about IAM roles: while deploying EMR clusters using an IAM role, I had a similar issue where at times (not every time) this error would come up while connecting boto to S3.

boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler']

The Metadata Service can time out while retrieving credentials. Thus, as the docs suggest, I added a Boto section to the config and increased the number of attempts made to retrieve the credentials. Note that the default is 1 attempt.

import boto, ConfigParser  # "ConfigParser" is the Python 2 name; it is "configparser" on Python 3

# Add a [Boto] section if one is not already there, then raise the number of
# attempts made against the instance metadata service (the default is 1).
try:
    boto.config.add_section("Boto")
except ConfigParser.DuplicateSectionError:
    pass
boto.config.set("Boto", "metadata_service_num_attempts", "20")

http://boto.readthedocs.org/en/latest/boto_config_tut.html?highlight=retries#boto

Scroll down to: "You can control the timeouts and number of retries used when retrieving information from the Metadata Service (this is used for retrieving credentials for IAM roles on EC2 instances)."
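
If you would rather not set this in code, the same options can go in your boto config file (e.g. ~/.boto); something along these lines, assuming I am reading the linked page correctly:

[Boto]
metadata_service_timeout = 3.0
metadata_service_num_attempts = 20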


I just ran into this problem while using Linux and SES, and I hope this may help others with a similar issue. I had installed awscli and configured my keys by doing:

sudo apt-get install awscli
aws configure

This sets up your credentials in ~/.aws/config, just like @huythang said. But boto looks for your credentials in ~/.aws/credentials, so copy them over:

cp ~/.aws/config ~/.aws/credentials
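
After the copy, ~/.aws/credentials should contain something like this (placeholder values shown):

[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx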

Assuming an appropriate policy is set up for the user those credentials belong to, you shouldn't need to set any environment variables.


I found my answer here.

On Unix: first set up the AWS config:

# vim ~/.aws/config
[default]
region = ap-northeast-1
aws_access_key_id = xxxxxxxxxxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxx

And set the environment variables (using your actual key values):

export AWS_ACCESS_KEY_ID="xxxxxxxxxxxxxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxx"

See the latest boto S3 introduction:

import os
from boto.s3.connection import S3Connection
conn = S3Connection(os.environ["AWS_ACCESS_KEY_ID"], os.environ["AWS_SECRET_ACCESS_KEY"])