I'm trying to build a project that uploads a Google Storage JSON file to BigQuery (automating something that is currently done manually).
I'd like to use a service account for this, as my script is going to run on a daily basis.
After reading everything I could find about using a service account, I'm still struggling to authenticate.
Could someone check my steps and point out what I missed?
Here is what I've done so far:
pip install --upgrade google-cloud-bigquery
export GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
with the key path specified correctly. Now I'm trying to run the following Python script:

from google.cloud import bigquery

bigquery_client = bigquery.Client()
I get this error:
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credential and re-run the application. For more information, please see https://developers.google.com/accounts/docs/application-default-credentials.
I'm quite new to both Python and the Google Cloud API, so I have possibly missed something.
Could someone point out where or what went wrong in my steps above, or point me to clear beginner-friendly instructions for setting up and running a simple BigQuery script with a service account?
I usually set this variable in the Python script itself, something like:
import os
from google.cloud.bigquery.client import Client

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'path_to_json_file'
bq_client = Client()