Is it possible to automate gsutil-based file uploads to Google Cloud Storage so that no user intervention is required for login?
My use case is a Jenkins job that polls an SCM location for changes to a set of files. If it detects any changes, it uploads all the files to a specific Google Cloud Storage bucket.
"gcloud" can create and manage Google Cloud resources while "gsutil" cannot do so. "gsutil" can manipulate buckets, bucket's objects and bucket ACLs on GCS(Google Cloud Storage) while "gcloud" cannot do so.
gsutil performs all operations, including uploads and downloads, using HTTPS and transport-layer security (TLS).
The gsutil cp command allows you to copy data between your local file system and the cloud, within the cloud, and between cloud storage providers. For example, to upload all text files from the local directory to a bucket, you can run: gsutil cp *.txt gs://my-bucket. You can also download data from a bucket.
After you configure your credentials once, gsutil requires no further intervention. I suspect that you ran gsutil config as user X, but Jenkins runs as user Y. As a result, ~jenkins/.boto does not exist. If you place the .boto file in the right location, you should be all set.
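As a sketch, the one-time handover could look like this. The helper name install_boto_for_jenkins and the default path /var/lib/jenkins are my assumptions; adjust them to your installation:

```shell
# Hypothetical one-time setup: hand an already-configured .boto to Jenkins.
# Run as the user who completed `gsutil config`. The Jenkins home path
# /var/lib/jenkins is an assumed default and may differ on your system.
install_boto_for_jenkins() {
  jenkins_home="${1:-/var/lib/jenkins}"
  cp "$HOME/.boto" "$jenkins_home/.boto"
  chmod 600 "$jenkins_home/.boto"   # the file holds credentials; keep it private
}
```

In practice you may also need sudo for the copy and a chown so the jenkins user owns the file.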
Another alternative is to keep multiple .boto files and tell gsutil which one to use with the BOTO_CONFIG environment variable:
gsutil config # complete oauth flow
cp ~/.boto /path/to/existing.boto
# detect that we need to upload
BOTO_CONFIG=/path/to/existing.boto gsutil -m cp files gs://bucket
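Inside a Jenkins build step, this pattern might be wrapped up as follows. The function name upload_changed, the BOTO_PATH variable, and the bucket name are illustrative, not anything gsutil defines:

```shell
# Sketch of a non-interactive upload step. BOTO_PATH must point at a .boto
# produced earlier with `gsutil config`; gs://my-bucket is a placeholder.
upload_changed() {
  src="$1"
  bucket="$2"
  # BOTO_CONFIG selects the stored credentials; -m parallelizes the copy.
  BOTO_CONFIG="$BOTO_PATH" gsutil -m cp -r "$src" "$bucket"
}
# e.g. upload_changed ./changed-files gs://my-bucket
```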
I frequently use this pattern to use gsutil with multiple accounts:
gsutil config # complete oauth flow for user A
mv ~/.boto user-a.boto
gsutil config # complete oauth flow for user B
mv ~/.boto user-b.boto
BOTO_CONFIG=user-a.boto gsutil cp a-file gs://a-bucket
BOTO_CONFIG=user-b.boto gsutil cp b-file gs://b-bucket
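The per-account switching can also be wrapped in a small helper. The gsutil_as name and the ~/.boto-configs directory layout are my own convention, not part of gsutil:

```shell
# Hypothetical helper: run gsutil under a named account's stored credentials.
# Assumes one .boto per account under $HOME/.boto-configs/<name>.boto,
# each created earlier by `gsutil config` and moved into place.
gsutil_as() {
  account="$1"; shift
  BOTO_CONFIG="$HOME/.boto-configs/$account.boto" gsutil "$@"
}
# e.g. gsutil_as user-a cp a-file gs://a-bucket
```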