I am using Node with Lambda and the AWS JavaScript SDK. I have a role attached to the Lambda function that grants the access I need. I want to be able to accept user input of access and secret keys and update my AWS config so that subsequent actions run with those credentials. So far:
let AWS = require("aws-sdk"); // I do the normal import
let ddb = new AWS.DynamoDB({apiVersion: '2012-10-08'}); // do some dynamo action
....
Then I use these keys, which have rights to another account's resources:
AWS.config = new AWS.Config({
  accessKeyId: data.accessKey,
  secretAccessKey: data.secretAccessKey
});
When I perform a new action, it still uses the permissions of the Lambda role and not the updated AWS credentials. Any ideas?
You can create a role in account B and permit your Lambda function's role in account A to assume it. Create a role in account A that will be used by your AWS Lambda function. Create a role in account B with the role type Role for Cross-Account Access, and assign it the permissions needed for the resources in account B that you want to reach (DynamoDB in this case).
Remember that by default your code runs in an environment that includes the AWS SDK for JavaScript, with credentials taken from the IAM role that you manage for the function (the Lambda execution role).
Configure your Lambda function's execution role to allow the function to assume an IAM role in another AWS account. Modify your cross-account IAM role's trust policy to allow your Lambda function to assume the role.
Add the AWS Security Token Service (AWS STS) AssumeRole API call to your Lambda function's code. Note: a Lambda function can assume an IAM role in another AWS account in order to access that account's resources, for example an Amazon Simple Storage Service (Amazon S3) bucket or, as here, a DynamoDB table.
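A minimal sketch of what that can look like in the handler, using the same aws-sdk v2 style as the question; the role ARN and session name below are placeholders you would replace with the role created in account B:
const AWS = require('aws-sdk');

exports.handler = async () => {
  const sts = new AWS.STS();

  // Assume the cross-account role in account B (the ARN is a placeholder)
  const { Credentials } = await sts.assumeRole({
    RoleArn: 'arn:aws:iam::111122223333:role/cross-account-role',
    RoleSessionName: 'cross-account-session'
  }).promise();

  // Create a DynamoDB client that uses the temporary credentials
  const ddb = new AWS.DynamoDB({
    apiVersion: '2012-10-08',
    accessKeyId: Credentials.AccessKeyId,
    secretAccessKey: Credentials.SecretAccessKey,
    sessionToken: Credentials.SessionToken
  });

  // Calls made with this client run with the assumed role's permissions
  return ddb.listTables().promise();
};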
When you update AWS.config, it updates the global AWS configuration. Any AWS service objects (S3, EC2, DynamoDB, ...) created after that point will have the updated credentials. It will not update any service objects created before the change to AWS.config.
As AWS Guru @johnrotenstein suggested, you should create your service object after updating the config. If your ddb object has already been created at that point, just redeclare it as a new DynamoDB({...}):
const AWS = require('aws-sdk')

// Update the global config first...
AWS.config = new AWS.Config({
  accessKeyId: data.accessKey,
  secretAccessKey: data.secretAccessKey
})

// ...then create (or recreate) the service object so it picks up the new credentials
let ddb = new AWS.DynamoDB({apiVersion: '2012-10-08'})
Another, possibly simpler, solution is to use the update method on the service object's config attribute:
const AWS = require('aws-sdk')
let ddb = new AWS.DynamoDB({apiVersion: '2012-10-08'})
ddb.config.update({accessKeyId: '', secretAccessKey: ''})
// ddb will now use the new credentials for future calls
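If the user-supplied keys are only needed for one client, another option (not from the answers above, but standard aws-sdk v2 behaviour) is to pass the credentials directly to the service constructor, leaving the global config and the Lambda role untouched:
const AWS = require('aws-sdk')

// Only this client uses the supplied keys; other service objects
// keep using the Lambda execution role's credentials
let crossAccountDdb = new AWS.DynamoDB({
  apiVersion: '2012-10-08',
  accessKeyId: data.accessKey,
  secretAccessKey: data.secretAccessKey
})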