
AWS: How to copy multiple files from local to S3?

I am trying to upload multiple files from my local machine to an AWS S3 bucket. I am able to use aws s3 cp to copy files one by one, but I need to upload several files (not all of them, i.e. a selective set) to the same S3 folder. Is it possible to do this in a single AWS CLI call, and if so, how?

Eg -

aws s3 cp test.txt s3://mybucket/test.txt

Reference -
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html

asked Sep 03 '19 by Ani


2 Answers

If you scroll down the documentation link you provided to the section entitled "Recursively copying local files to S3", you will see the following:

When passed with the parameter --recursive, the following cp command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg

So, assuming you wanted to copy all .txt files in some subfolder to the same bucket in S3, you could try something like:

aws s3 cp yourSubFolder s3://mybucket/ --recursive

If there are other files in this subfolder that you don't want uploaded, you need to add the --exclude and --include parameters (otherwise all files will be uploaded):

aws s3 cp yourSubFolder s3://mybucket/ --recursive --exclude "*" --include "*.txt"
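This same filter mechanism also answers the original question of uploading a specific, named set of files in a single call: exclude everything, then re-include each file you want. Filters are applied in the order given, so later --include flags take precedence over the initial --exclude. A sketch, reusing the folder and bucket placeholders from above (--dryrun prints the planned uploads without transferring anything, so you can verify the selection first):

```shell
# Exclude everything, then re-include only the selected files.
# --dryrun shows what would be copied without actually uploading.
aws s3 cp yourSubFolder s3://mybucket/ --recursive \
    --exclude "*" \
    --include "test1.txt" \
    --include "test2.jpg" \
    --dryrun
```

Once the dry-run output looks right, drop --dryrun to perform the upload.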
answered Oct 07 '22 by Tim Biegeleisen

If you're doing this from bash, then you can use this pattern as well:

for f in *.png; do aws s3 cp "$f" s3://my/dest; done

You would of course customize *.png to your glob pattern and change the S3 destination. Quoting "$f" keeps filenames with spaces intact.

If you have an arbitrary set of files, you can put their names in a text file, say filenames.txt (one per line), and then:

for f in $(cat filenames.txt); do aws s3 cp "$f" s3://my/dest; done
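One caveat: looping over the output of cat word-splits, so filenames containing spaces still break. A while read loop avoids that. A minimal sketch, using echo in place of the real aws s3 cp call so it runs without AWS credentials (s3://my/dest is just the placeholder destination from above):

```shell
# Create a sample list: one filename per line, spaces allowed.
printf '%s\n' "report 2019.txt" "notes.txt" > filenames.txt

# Read line by line; IFS= and -r preserve whitespace and backslashes.
while IFS= read -r f; do
    echo aws s3 cp "$f" s3://my/dest   # swap echo for the real command
done < filenames.txt
```

Each iteration sees the whole line as one filename, spaces and all.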
answered Oct 07 '22 by Tyler