I would like to use the ~/.aws/credentials file I maintain with different profiles from my Spark Scala application, if that is possible. I know how to set Hadoop configuration for s3a inside my app, but I don't want to keep hardcoding different keys and would rather use my credentials file as I do with other programs. I've also experimented with the Java API, e.g. val credentials = new DefaultAWSCredentialsProviderChain().getCredentials()
and then creating an S3 client, but that doesn't apply my keys when Spark reads files from S3. I also know that keys can go in core-site.xml
when I run my app, but how can I manage different keys, and how can I set this up in IntelliJ so that different profiles pull in different keys?
DefaultAWSCredentialsProviderChain already consults several sources (environment variables, system properties, the ~/.aws/credentials profile file, instance metadata), but it gives you no control over which. If you want an explicit chain, build one yourself, e.g.:
import com.amazonaws.auth.{AWSCredentialsProviderChain, EnvironmentVariableCredentialsProvider}
import com.amazonaws.auth.profile.ProfileCredentialsProvider

val providerChain = new AWSCredentialsProviderChain(
  new EnvironmentVariableCredentialsProvider(),
  new ProfileCredentialsProvider()
)
val awsCredentials = providerChain.getCredentials
You can use the resolved credentials with an S3 client or, as you mention, with Spark's Hadoop configuration:
hadoopConfig.set("fs.s3a.access.key", awsCredentials.getAWSAccessKeyId)
hadoopConfig.set("fs.s3a.secret.key", awsCredentials.getAWSSecretKey)
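Alternatively, depending on your Hadoop version (the fs.s3a.aws.credentials.provider option appeared in Hadoop 2.8), you can point s3a straight at the profile file instead of copying keys into the configuration yourself — a sketch for core-site.xml:

```xml
<!-- core-site.xml: let s3a consult ~/.aws/credentials directly -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>com.amazonaws.auth.profile.ProfileCredentialsProvider</value>
</property>
```

The same key/value pair can also be set programmatically via hadoopConfig.set if you prefer to keep it in code.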
To switch between different AWS profiles, set the AWS_PROFILE environment variable before launching the app; ProfileCredentialsProvider reads it to pick the profile from ~/.aws/credentials. In IntelliJ you can set AWS_PROFILE per run configuration under Run > Edit Configurations > Environment variables, so each run configuration pulls in a different profile. Happy to expand on any particular point if needed.
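For example, from a shell (the class name and jar path below are placeholders):

```shell
# Select the profile for this run; ProfileCredentialsProvider
# reads AWS_PROFILE and falls back to "default" when it is unset.
export AWS_PROFILE=staging
echo "$AWS_PROFILE"   # the JVM launched below inherits this value
# spark-submit --class MyApp target/app.jar
```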