I am trying to upload multiple files from my local machine to an AWS S3 bucket.
I am able to use aws s3 cp to copy files one by one,
but I need to upload several selective files (not all of them) to the same S3 folder.
Is it possible to do this in a single AWS CLI call, and if so, how?
E.g.:
aws s3 cp test.txt s3://mybucket/test.txt
Reference -
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
If you scroll down the documentation link you provided to the section entitled "Recursively copying local files to S3", you will see the following:
When passed with the parameter --recursive, the following cp command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg.
So, assuming you wanted to copy all .txt
files in some subfolder to the same bucket in S3, you could try something like:
aws s3 cp yourSubFolder s3://mybucket/ --recursive
If there are any other files in this subfolder, you need to add the --exclude
and --include
parameters (otherwise all files will be uploaded):
aws s3 cp yourSubFolder s3://mybucket/ --recursive --exclude "*" --include "*.txt"
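If your selected files don't share a single pattern, note that --include can be repeated: --exclude "*" first drops everything, and each --include re-adds its matches. A sketch of that idea follows; the subfolder, bucket, and filenames are placeholders, and the leading echo makes it a dry run (drop the echo, or use the CLI's --dryrun flag, to actually upload):

```shell
# Sketch: select several specific files/patterns in one recursive call.
# --exclude "*" filters out everything first; each --include re-adds matches.
# echo prints the command instead of running it (a dry run).
echo aws s3 cp yourSubFolder s3://mybucket/ --recursive \
  --exclude "*" \
  --include "report.txt" \
  --include "*.csv"
```

Filters are applied in order, so putting --exclude "*" first and the --include patterns after it is what makes this selective.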
If you're doing this from bash, then you can use this pattern as well:
for f in *.png; do aws s3 cp "$f" s3://my/dest/; done
You would of course customize *.png
to be your glob pattern, and the S3 destination; end the destination with a /
so each file keeps its own name under that prefix rather than overwriting a single object.
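To preview what such a loop would do before running it, you can prefix the command with echo; this sketch uses sample filenames and a hypothetical destination prefix, so adjust both:

```shell
# Create a couple of sample files so the glob matches (illustration only)
touch sample1.png sample2.png

# echo prints each command instead of executing it; note the quotes
# around "$f" (safe for names with spaces) and the trailing slash
# on the destination prefix
for f in *.png; do
  echo aws s3 cp "$f" s3://mybucket/images/
done
```

Once the printed commands look right, remove the echo to perform the real uploads.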
If you have an arbitrary set of files, you can put their names in a text file, one per line; call it filenames.txt
and then:
for f in `cat filenames.txt`; do aws s3 cp "$f" s3://my/dest/; done
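One caveat: the backtick form word-splits, so filenames containing spaces break. A while read loop handles them; here is a sketch with a hypothetical filenames.txt and bucket, run as a dry run via echo:

```shell
# Hypothetical list of files to upload, one name per line
printf 'notes.txt\nmy report.txt\n' > filenames.txt

# IFS= and read -r preserve each line exactly, spaces included;
# drop the echo to perform the real uploads
while IFS= read -r f; do
  echo aws s3 cp "$f" s3://mybucket/
done < filenames.txt
```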