Due to certain enterprise limitations, I'm only able to access AWS through the command line, and I cannot set environment variables. I was wondering if there is any way to pass in my keys with the command in a manner like this:
aws s3 cp <file> s3://testbucket --aws-access-key <accesskey> --aws-secret-key <secretkey>
I noticed that this question is fairly similar, although the answers there either do not apply to my situation or reference the ec2din command, which I could not translate into copying files to S3. When I try the command above, I just get the response Unknown options: --aws-access-key,--aws-secret-key.
Try this:
AWS_ACCESS_KEY_ID=AAAA AWS_SECRET_ACCESS_KEY=BBB aws s3 cp <file> s3://testbucket
This sets the keys for this one command only. If you need them for the rest of the session, export them instead:
export AWS_ACCESS_KEY_ID=AAAA ; export AWS_SECRET_ACCESS_KEY=BBB ; aws s3 cp <file> s3://testbucket
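If your enterprise hands out temporary credentials (for example from STS or SSO), the CLI also reads AWS_SESSION_TOKEN, so a sketch of the same one-off pattern would be (CCC is a placeholder for your session token):

AWS_ACCESS_KEY_ID=AAAA AWS_SECRET_ACCESS_KEY=BBB AWS_SESSION_TOKEN=CCC aws s3 cp <file> s3://testbucket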
Are you allowed to save the AK/SK to a file? (Much like an SSH private key is saved in ~/.ssh/id_rsa, for example.)
If so, you can run the command aws configure, which will prompt for your AK and SK (plus a default region and a default output format). The credentials will be saved to ~/.aws/credentials, and the region and output format (if you chose to specify them) will be saved to ~/.aws/config.
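For reference, the files that aws configure writes look roughly like this (the key values and region are placeholders, adjust to your account):

~/.aws/credentials
[default]
aws_access_key_id = AAAA
aws_secret_access_key = BBB

~/.aws/config
[default]
region = us-east-1
output = json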
If you are not allowed to write your credentials to a file, be careful about passing credentials on the command line - they can end up in your shell's history file! In some shells you can configure things so that a command prefixed with a space is not written to the history file.
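In bash, for example, that behaviour is controlled by HISTCONTROL; a minimal sketch, assuming bash:

export HISTCONTROL=ignorespace
 AWS_ACCESS_KEY_ID=AAAA AWS_SECRET_ACCESS_KEY=BBB aws s3 cp <file> s3://testbucket

The second command starts with a space, so it is not recorded in the history and never reaches ~/.bash_history.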