I am having trouble downloading multiple files from AWS S3 buckets to my local machine.
I have a list of all the filenames that I want to download, and I do not want any others. How can I do that? Is there any kind of loop in the aws-cli that would let me iterate over the filenames?
There are a couple hundred files to download, so it does not seem feasible to use one single command that takes all the filenames as arguments.
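I was imagining something along these lines (just a rough sketch; files.txt and my-bucket are placeholder names):

# files.txt holds one object key per line; my-bucket is a placeholder bucket name
mkdir -p ./downloads
while IFS= read -r key; do
    aws s3 cp "s3://my-bucket/$key" ./downloads/
done < files.txt

Is a shell loop like this the right approach, or is there a better way built into the aws-cli?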
In the Buckets list, choose the name of the bucket that you want to upload your folders or files to. In a window other than the console window, select the files and folders that you want to upload. Then, drag and drop your selections into the console window that lists the objects in the destination bucket.
If you have Visual Studio with the AWS Explorer extension installed, you can also browse to Amazon S3 (step 1), select your bucket (step 2), select all the files you want to download (step 3), and right-click to download them all (step 4).
In the Amazon S3 console, choose your S3 bucket, choose the file that you want to open or download, choose Actions, and then choose Open or Download. If you are downloading an object, specify where you want to save it.
Also, one can use the --recursive option, as described in the documentation for the cp command. It will copy all objects under a specified prefix recursively.
Example:
aws s3 cp s3://folder1/folder2/folder3 . --recursive
will grab all files under folder1/folder2/folder3 (where folder1 is the bucket name) and copy them to the current local directory.
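Since the question asks for specific files rather than an entire prefix, the same cp command's --exclude and --include filters can restrict a recursive copy to just the named objects. A minimal sketch, assuming a bucket named my-bucket and placeholder filenames:

# Exclude everything first, then re-include only the wanted keys (placeholder names)
aws s3 cp s3://my-bucket/ ./downloads/ --recursive \
    --exclude "*" \
    --include "reports/file-001.csv" \
    --include "reports/file-002.csv"

Filters are applied in the order given, so the trailing --include patterns take precedence over the initial --exclude "*". With a couple hundred filenames, the --include arguments could be generated from your file list with a small script.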