 

Verifying S3 credentials w/o GET or PUT using boto3

Is there a way to verify a given set of S3 credentials has access to a specific bucket without doing an explicit PUT or GET of some sort?

Instantiating an s3.Client, s3.Resource or s3.Bucket object doesn't seem to verify credentials at all, let alone bucket access.

boto3 1.4.7, Python 2.7.13.

We have automation and orchestration that automates bucket creation and I want to include a piece that verifies a user's access key and secret. I know the bucket exists at this point since I created it. The bucket is empty.

I want to verify a user has access w/o doing a PUT operation.

Thanks for any help.

Update:

I ended up doing this with an s3.Client object:

objects = client.list_objects(Bucket=cfg['bucket'])

Since the bucket is empty, this is a lightweight operation, and mostly a one-liner (wrapped in a try block).

BenH asked Nov 01 '17


2 Answers

Yes, you can use IAM policy simulation for that. Here's an example:

import boto3

iam = boto3.client('iam')
sts = boto3.client('sts')

# Get the arn represented by the currently configured credentials
arn = sts.get_caller_identity()['Arn']

# Create an arn representing the objects in a bucket
bucket_objects_arn = 'arn:aws:s3:::%s/*' % 'my-test-bucket'

# Run the policy simulation for the basic s3 operations
results = iam.simulate_principal_policy(
    PolicySourceArn=arn,
    ResourceArns=[bucket_objects_arn],
    ActionNames=['s3:PutObject', 's3:GetObject', 's3:DeleteObject']
)
for result in results['EvaluationResults']:
    print("%s - %s" % (result['EvalActionName'], result['EvalDecision']))

You can find the full list of S3 actions in the AWS IAM documentation.

One caveat to this is that IAM is eventually consistent, so if you're creating users on the fly you still might have to wait a bit for the changes to propagate.
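If that propagation delay matters in your automation, one pragmatic option is to poll until the simulation returns the decision you expect. A minimal, generic retry helper might look like this (the name and defaults are mine, not a boto3 API):

```python
import time

def wait_until(check, attempts=10, delay=2.0):
    """Call check() up to `attempts` times, sleeping `delay` seconds between
    tries; return True as soon as check() does, else False after the last try."""
    for i in range(attempts):
        if check():
            return True
        if i < attempts - 1:
            time.sleep(delay)
    return False
```

You would wrap the simulate_principal_policy call in a function that returns True once every EvalDecision is "allowed", then pass that function as `check`.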

Jordon Phillips answered Sep 21 '22


Use head_bucket.

head_bucket(**kwargs)

This operation is useful to determine if a bucket exists and you have permission to access it.

If you don't have access, you will get an exception:

botocore.exceptions.ClientError: An error occurred (403) when calling the HeadBucket operation: Forbidden

Modify this Python 2.7 code to suit your needs. Catching botocore.exceptions.ClientError specifically (rather than a bare except) avoids silently swallowing unrelated errors:

import boto3
import botocore.exceptions

s3 = boto3.client('s3')
try:
  s3.head_bucket(Bucket='mybucket')
  print 'Can access the bucket'
except botocore.exceptions.ClientError:
  print 'Cannot access the bucket'
helloV answered Sep 19 '22