I'm new to Google Cloud Platform. I have trained my model in Datalab and saved the model folder to a bucket in Cloud Storage. I can download individual files from the bucket to my local machine by right-clicking the file and choosing "Save link as". But when I try to download a folder the same way, I get an image of the folder rather than its contents. Is there any way I can download the whole folder and its contents as they are? Is there a gsutil command to copy folders from Cloud Storage to a local directory?
Cloud Shell runs in a container. The home directory is mapped to a Cloud Storage location owned by and managed by Google. You have no access to that storage location outside of Cloud Shell.
You can find documentation for the gsutil tool, and for the cp command that answers your question specifically, in the Cloud Storage docs.
The command you want to use is:
gsutil cp -r gs://bucket/folder .
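If you would rather script the download than use gsutil, the same recursive copy can be done with the google-cloud-storage Python client. This is only a sketch: the bucket and folder names are hypothetical, and it assumes the client library is installed and credentials are configured.

```python
import os


def local_path_for(blob_name: str, prefix: str, dest: str) -> str:
    """Map an object name like 'model/vars/data' to a file path under dest."""
    relative = blob_name[len(prefix):].lstrip("/")
    return os.path.join(dest, *relative.split("/"))


def download_folder(bucket_name: str, prefix: str, dest: str) -> None:
    """Download every object under gs://<bucket_name>/<prefix> into dest."""
    # Assumption: `pip install google-cloud-storage` and credentials configured.
    from google.cloud import storage

    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):  # skip zero-byte "folder" placeholder objects
            continue
        target = local_path_for(blob.name, prefix, dest)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        blob.download_to_filename(target)


# Hypothetical names -- substitute your own bucket and folder:
# download_folder("my-bucket", "model/", "./model")
```

Cloud Storage has no real directories, so the client lists every object whose name starts with the prefix and recreates the directory layout locally.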
This is how you can download a folder from a Google Cloud Storage bucket.
Run the following command to download it from the bucket to your Google Cloud Console local path:
gsutil -m cp -r gs://{bucketname}/{folderPath} {localpath}
Once you run that command, confirm that the folder is on the local path by running the ls
command to list the files and directories there.
Now zip the folder by running the command below:
zip -r foldername.zip yourfolder/*
Once the zip process is done, click the "More" dropdown menu at the right side of the Google Cloud Console,
then select the "Download file" option. You will be prompted for the name of the file to download; enter the name of the zip file, "foldername.zip".