
How should I pass my S3 credentials to a Python Lambda function on AWS?

I'd like to write a file to S3 from my Lambda function written in Python, but I'm struggling to pass my AWS access key ID and secret key.

The following works on my local machine, after I set the environment variables AWS_SHARED_CREDENTIALS_FILE and AWS_CONFIG_FILE to point to the local files I created with the AWS CLI.

import boto3

# Credentials are resolved from the shared credentials/config files
session = boto3.session.Session(region_name='us-east-2')
s3 = session.client('s3',
     config=boto3.session.Config(signature_version='s3v4'))

And the following works on Lambda, where I hard-code my ID and key (redacted as *** here):

import boto3

# Hard-coded credentials (redacted) -- works, but insecure
AWS_ACCESS_KEY_ID = '***'
AWS_SECRET_ACCESS_KEY = '***'
session = boto3.session.Session(region_name='us-east-2')
s3 = session.client('s3',
     config=boto3.session.Config(signature_version='s3v4'),
     aws_access_key_id=AWS_ACCESS_KEY_ID,
     aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

But I understand this is insecure after reading best practices from Amazon. So I try:

import os

import boto3

# Read the credentials from environment variables instead
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
session = boto3.session.Session(region_name='us-east-2')
s3 = session.client('s3',
     config=boto3.session.Config(signature_version='s3v4'),
     aws_access_key_id=AWS_ACCESS_KEY_ID,
     aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

But I get an error: "The AWS Access Key Id you provided does not exist in our records." I also tried to define these variables in the Lambda console, but then I get: "Lambda was unable to configure your environment variables because the environment variables you have provided contains reserved keys."

I am a little surprised that I need to pass an ID or key at all, since I believe the account I author the Lambda function with also has permission to write to the S3 bucket (the key and secret I hard-code come from IAM for this same account). I got the same impression from reading the following post: AWS Lambda function write to S3

asked Apr 01 '17 by David A


People also ask

How do I get data from S3 bucket in Python Lambda?

Create a Lambda Function to transform data for your use case. Create an S3 Object Lambda Access Point from the S3 Management Console. Select the Lambda function that you created above. Provide a supporting S3 Access Point to give S3 Object Lambda access to the original object.
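A rough sketch of such a transforming function, following the documented S3 Object Lambda event shape (the uppercase transform is just a placeholder):

import urllib.request

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    ctx = event['getObjectContext']
    # Fetch the original object through the presigned URL S3 supplies
    original = urllib.request.urlopen(ctx['inputS3Url']).read()
    # Placeholder transformation: uppercase the object's contents
    transformed = original.decode('utf-8').upper().encode('utf-8')
    # Return the transformed bytes to the caller of GetObject
    s3.write_get_object_response(Body=transformed,
                                 RequestRoute=ctx['outputRoute'],
                                 RequestToken=ctx['outputToken'])
    return {'statusCode': 200}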

What is the most secure way to grant the Lambda function access to the S3 bucket and the DynamoDB table?

In order to grant a Lambda function access to a DynamoDB table, we have to attach an IAM policy to the function's execution role. The policy should grant permissions for all the actions the function needs to perform on the table.
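For instance, a least-privilege policy on the execution role could look like the following; the actions, region, account ID and table name are placeholders for whatever the function actually needs:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
      "Resource": "arn:aws:dynamodb:us-east-2:123456789012:table/my-table"
    }
  ]
}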

How does Lambda communicate with S3?

Amazon S3 can send an event to a Lambda function when an object is created or deleted. You configure notification settings on the bucket, and grant Amazon S3 permission to invoke the function in the function's resource-based policy.
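A minimal sketch of the receiving function, assuming a standard s3:ObjectCreated:* notification (the record layout is the documented S3 event shape):

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        # React to the created (or deleted) object here
        print(f'Received event for s3://{bucket}/{key}')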


1 Answer

You never need to pass AWS access keys when one AWS resource is calling another. Just allow the Lambda function to access the S3 bucket and any actions that you want to take (e.g. s3:PutObject). If you make sure the Lambda function's execution role carries a policy allowing that kind of access, the SDK takes all authentication out of your hands. That also explains the error you saw: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are reserved keys that Lambda populates with the execution role's temporary credentials, and those are only valid together with the matching AWS_SESSION_TOKEN.
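For illustration, a minimal handler could look like this; the bucket name and key are placeholders, and it assumes the execution role is allowed s3:PutObject on that bucket:

import boto3

# No credentials are passed anywhere: boto3 picks up the temporary
# credentials Lambda injects for the function's execution role.
s3 = boto3.client('s3',
     config=boto3.session.Config(signature_version='s3v4'))

def lambda_handler(event, context):
    # 'my-bucket' and 'output/result.txt' are placeholder names
    s3.put_object(Bucket='my-bucket',
                  Key='output/result.txt',
                  Body=b'hello from Lambda')
    return {'status': 'ok'}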

If you do need secret keys in Lambda, e.g. for third-party systems or an AWS RDS database (non-Aurora), have a look at AWS KMS; it works nicely together with Lambda. But again: access to S3 from Lambda should be handled with the right role/policy in IAM.
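A rough sketch of the KMS pattern, assuming the secret was encrypted with a KMS key the execution role may use for kms:Decrypt and stored base64-encoded in an environment variable (the name DB_PASSWORD_ENCRYPTED is made up for illustration):

import base64
import os

import boto3

kms = boto3.client('kms')

def get_db_password():
    # Decode the base64 ciphertext from the environment and let KMS
    # decrypt it; do this once at cold start rather than per request.
    ciphertext = base64.b64decode(os.environ['DB_PASSWORD_ENCRYPTED'])
    return kms.decrypt(CiphertextBlob=ciphertext)['Plaintext'].decode('utf-8')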

answered Oct 05 '22 by Bram