Scenario: there are multiple folders and many files stored in storage bucket that is accessible by project team members. Instead of downloading individual files one at a time (which is very slow and time consuming), is there a way to download entire folders? Or at least multiple files at once? Is this possible without having to use one of the command consoles? Some of the team members are not tech savvy and need to access these files as simple as possible. Thank you for any help!
I would suggest downloading the files with gsutil. If you have a large number of files to transfer, you might want to use the -m option to perform a parallel (multi-threaded/multi-processing) copy:
gsutil -m cp -R gs://your-bucket .
The time reduction for downloading the files can be quite significant. See the Cloud Storage documentation for complete information on the cp command.
If you want to copy into a particular directory, note that the directory must exist first, as gsutil won't create it automatically, e.g.:
mkdir my-bucket-local-copy && gsutil -m cp -r gs://your-bucket my-bucket-local-copy
I recommend they use gsutil. GCS's API deals with only one object at a time, but its command-line utility, gsutil, is more than happy to download a bunch of objects in parallel. Downloading an entire GCS "folder" with gsutil is pretty simple:
$> gsutil cp -r gs://my-bucket/remoteDirectory localDirectory
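If someone on the team later wants to script this instead of using gsutil, here is a minimal Python sketch using the official google-cloud-storage client. It illustrates the one-object-at-a-time nature of the API by looping over every object under a prefix; the bucket name, prefix, and destination folder below are just placeholders taken from the example above, not anything specific to your project.
import os
from google.cloud import storage  # pip install google-cloud-storage

def download_prefix(bucket_name, prefix, destination):
    """Download every object under `prefix` into `destination`, one object per request."""
    client = storage.Client()  # uses your default application credentials
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        # Skip "directory placeholder" objects that end with a slash.
        if blob.name.endswith("/"):
            continue
        local_path = os.path.join(destination, blob.name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)

# Example: mirror gs://my-bucket/remoteDirectory into ./localDirectory
download_prefix("my-bucket", "remoteDirectory/", "localDirectory")
Unlike gsutil -m, this sketch downloads sequentially; gsutil remains the simpler option for bulk copies.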