I'm trying to understand how to copy local files to cloud storage using gsutil, so I can write a script that moves files. I followed these steps:
C:\Program Files (x86)\Google\Cloud SDK>gsutil ls
gs://sa-upload-test/
C:\Program Files (x86)\Google\Cloud SDK>cd\spare
C:\Spare>gsutil cp *.txt gs://sa-upload-test
CommandException: No URLs matched: *.txt
I changed the folder properties to grant permissions to Everyone, re-ran the command, and still got the same result. Can anyone tell me what I am missing?
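Before blaming permissions, it's worth confirming the wildcard actually matches anything in that directory, since gsutil performs its own wildcard expansion and reports "No URLs matched" when nothing matches locally. A minimal sketch (using a temporary directory as a stand-in for C:\Spare):

```python
import glob
import os
import tempfile

# Stand-in for the asker's C:\Spare folder: create it and put two .txt files in it.
folder = tempfile.mkdtemp()
for name in ("a.txt", "b.txt"):
    open(os.path.join(folder, name), "w").close()

# This is the same expansion gsutil attempts; if the list is empty,
# "CommandException: No URLs matched: *.txt" is the expected outcome.
matches = glob.glob(os.path.join(folder, "*.txt"))
print(len(matches))
```

If the equivalent check in your real folder comes back empty, the problem is the pattern or the working directory, not bucket permissions.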
Running the gsutil cp command above with the -DD flag shows that gsutil isn't even making an API call to check for the object in question; the failure happens during local wildcard expansion, before any request is sent.
If both the source and destination URL are cloud URLs from the same provider, gsutil copies data "in the cloud" (without downloading to and uploading from the machine where you run gsutil). In addition to the performance and cost advantages of doing this, copying in the cloud preserves metadata such as Content-Type and Cache-Control.
At the end of every upload or download, the gsutil cp command validates that the checksum it computes for the source file matches the checksum that the service computes. If the checksums do not match, gsutil deletes the corrupted object and prints a warning message. If this happens, contact [email protected].
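The integrity check described above compares digests; Cloud Storage reports an object's MD5 as a base64-encoded digest (visible via `gsutil ls -L` or `gsutil hash`). A sketch of computing the matching value locally, so you can compare it with what the service reports:

```python
import base64
import hashlib

def local_md5_b64(data: bytes) -> str:
    """Base64-encoded MD5 digest, the format Cloud Storage reports for objects."""
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")

# Example: digest of a known payload, comparable to the "Hash (md5)"
# field printed by `gsutil ls -L gs://bucket/object`.
print(local_md5_b64(b"hello world"))
```

Note this is a sketch of the comparison, not gsutil's internal code; gsutil also supports CRC32C, which is the only checksum available for composite objects.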
gsutil writes data to a temporary directory in several cases:
- when compressing data to be uploaded (see the -z and -Z options)
- when decompressing data being downloaded (for example, when the data has Content-Encoding:gzip as a result of being uploaded using gsutil cp -z or gsutil cp -Z)
- when running integration tests using the gsutil test command
It seems gsutil doesn't recognize those files on your local system. Try opening a fresh terminal session, and confirm the files actually exist in that directory (run dir *.txt in C:\Spare) before re-running the command.
In my case, I was uploading Django static files, so I re-collected my static files (python manage.py collectstatic) and it worked.