 

How to download multiple files in Google Cloud Storage

Scenario: there are multiple folders and many files stored in a storage bucket that is accessible by project team members. Instead of downloading individual files one at a time (which is very slow and time-consuming), is there a way to download entire folders, or at least multiple files at once? Is this possible without having to use one of the command consoles? Some of the team members are not tech savvy and need to access these files as simply as possible. Thank you for any help!

asked Aug 25 '16 by G Lee



2 Answers

I would suggest downloading the files with gsutil. If you have a large number of files to transfer, use the gsutil -m option to perform a parallel (multi-threaded/multi-processing) copy:

gsutil -m cp -R gs://your-bucket .

The time reduction for downloading the files can be quite significant. See this Cloud Storage documentation for complete information on the GCS cp command.

If you want to copy into a particular directory, note that the directory must exist first, as gsutil won't create it automatically (e.g. mkdir my-bucket-local-copy && gsutil -m cp -r gs://your-bucket my-bucket-local-copy).

answered Oct 21 '22 by Kevin Katzke


I recommend they use gsutil. GCS's API deals with only one object at a time, but its command-line utility, gsutil, is more than happy to download a bunch of objects in parallel. Downloading an entire GCS "folder" with gsutil is pretty simple:

$> gsutil cp -r gs://my-bucket/remoteDirectory localDirectory
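Because the API handles one object at a time, a scripted download boils down to listing every object under a prefix and fetching each in a loop. Here is a minimal sketch using the google-cloud-storage Python client; the bucket, prefix, and destination names are placeholders, and application credentials are assumed to be configured:

```python
import os

def local_path_for(blob_name, prefix, dest_dir):
    """Map an object name like 'remoteDirectory/a/b.txt' to 'dest_dir/a/b.txt'."""
    relative = blob_name[len(prefix):].lstrip("/")
    return os.path.join(dest_dir, relative)

def download_folder(bucket_name, prefix, dest_dir):
    """Fetch every object under prefix, one at a time, mirroring gsutil cp -r."""
    # Deferred import so the path helper above works without the library installed.
    from google.cloud import storage

    client = storage.Client()  # picks up ambient application credentials
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):  # skip zero-byte "folder" placeholder objects
            continue
        target = local_path_for(blob.name, prefix, dest_dir)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        blob.download_to_filename(target)

# Example (placeholder names):
# download_folder("my-bucket", "remoteDirectory", "localDirectory")
```

Each download_to_filename call is a separate request, which is why gsutil -m (which parallelizes the same loop) is usually faster for many files.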
answered Oct 21 '22 by Brandon Yarbrough