How do I upload data to Google BigQuery with gsutil, by using a Service Account I created in the Google APIs Console?
First, I'm trying to upload data to Cloud Storage using gsutil, since that seems to be the recommended model. Everything works fine with Gmail user approval, but it does not let me use a service account.
It seems I can use the Python API to get an access token using signed JWT credentials, but I would prefer using a command-line tool like gsutil with support for resumable uploads etc.
EDIT: I would like to run gsutil from a cron job to upload files to Cloud Storage every night and then import them into BigQuery.
Any help or directions to go would be appreciated.
If you install gsutil and bq as part of the Google Cloud SDK, both tools share the SDK's credentials, so authorizing a service account once with gcloud makes it available to gsutil and bq alike.
A service account is a Google account that is associated with your Google Cloud project. Use a service account to access the BigQuery API when your application needs to run jobs under its own credentials rather than an end-user's, for example in a batch processing pipeline.
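With the SDK-bundled tools, the whole flow can be sketched roughly as follows (the key-file path, bucket, dataset, and table names below are placeholders, not values from the question):

```shell
# Authorize the service account once; gsutil and bq reuse these credentials.
gcloud auth activate-service-account --key-file=/path/to/key.json

# Copy the nightly export to Cloud Storage (gsutil resumes large uploads automatically).
gsutil cp /data/export.csv gs://my-bucket/export.csv

# Load the file from Cloud Storage into an existing BigQuery table.
bq load --source_format=CSV mydataset.mytable gs://my-bucket/export.csv
```

This assumes the destination table already exists (or that you pass a schema to bq load); it is a sketch of the credential flow, not a complete pipeline.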
Google Cloud Storage just released a new version (3.26) of gsutil that supports service accounts (as well as a number of other features and bug fixes). If you already have gsutil installed you can get this version by running:
gsutil update
In brief, you can configure a service account by running:
gsutil config -e
See gsutil help config for more details about using the config command, and gsutil help creds for information about the different flavors of credentials (and their use cases) that gsutil supports.
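For the nightly cron use case from the question, a minimal sketch might look like this (the schedule, user, file path, and bucket name are placeholders, and it assumes gsutil was already configured with "gsutil config -e"):

```shell
# /etc/cron.d/nightly-upload: run every night at 02:00 as backup-user.
# The % in the date format must be escaped as \% inside a crontab entry.
0 2 * * * backup-user /usr/local/bin/gsutil cp /data/export-$(date +\%F).csv gs://my-bucket/
```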
Mike Schwartz, Google Cloud Storage Team