I'm using gcloud configurations to handle my CLI access, switching between them with `gcloud config configurations activate <env_name>`. I'm NOT using the GOOGLE_APPLICATION_CREDENTIALS env var at all, since I want to be able to switch between configurations/projects/accounts.
This works well with resources like google.cloud.firestore.Client(), which picks up the current configuration.
I'm trying to make authenticated calls between my (Python) Cloud Functions. When I try to get the token using:
import google.auth.transport.requests
import google.oauth2.id_token

auth_req = google.auth.transport.requests.Request()
id_token = google.oauth2.id_token.fetch_id_token(auth_req, audience)
I'm getting google.auth.exceptions.DefaultCredentialsError: Neither metadata server or valid service account credentials are found. I'll note that in a real Cloud Function, fetch_id_token works.
I can get the token with the CLI command `gcloud auth print-identity-token`, but I want to get it via the Python google-auth library so the same code works both on my local machine (using functions-framework) and in a real Cloud Function.
Is it possible? Am I approaching all of this in the wrong way?
btw I'm using a Linux machine.
I have been struggling with the same issue. For some reason there is almost no documentation on how to do this, but I managed to find a working approach based on the following question, How to get a GCP Bearer token programmatically with python, which asks about access tokens. I found that ID tokens can be fetched the same way.
Using the following snippet:
import google.auth
from google.auth.transport.requests import Request as GoogleAuthRequest

# Load the Application Default Credentials and refresh them; after the
# refresh, the id_token attribute is populated on the credentials object.
auth_req = GoogleAuthRequest()
creds, _ = google.auth.default()
creds.refresh(auth_req)
id_token = creds.id_token
you will be able to fetch the id_token from the local default credentials, following the lookup order described in the documentation: https://google-auth.readthedocs.io/en/master/reference/google.auth.html
This also works against the metadata server when running in a cloud environment.
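To get a single code path for both environments, one option is to try fetch_id_token first (the metadata-server/service-account route that works in a deployed function) and fall back to refreshing the default credentials locally, as in the snippet above. This is a sketch under those assumptions; the helper names get_id_token and decode_jwt_claims are my own, not part of any library, and the claim decoder does not verify the token's signature, it only inspects the payload:

```python
import base64
import json


def decode_jwt_claims(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT for inspection."""
    payload_b64 = token.split(".")[1]
    # JWT segments use base64url without padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))


def get_id_token(audience: str) -> str:
    """Fetch an ID token on GCP if possible, else from local default creds."""
    import google.auth
    import google.auth.transport.requests
    import google.oauth2.id_token

    auth_req = google.auth.transport.requests.Request()
    try:
        # Works in a deployed Cloud Function (metadata server) or with a
        # service account key pointed to by GOOGLE_APPLICATION_CREDENTIALS.
        return google.oauth2.id_token.fetch_id_token(auth_req, audience)
    except google.auth.exceptions.DefaultCredentialsError:
        # Local fallback: refresh whatever gcloud stored as Application
        # Default Credentials and read the id_token off the refreshed creds.
        creds, _ = google.auth.default()
        creds.refresh(auth_req)
        return creds.id_token
```

Note the fallback branch ignores the audience argument: the id_token attached to refreshed user credentials has a fixed audience, so you should check (e.g. with decode_jwt_claims) that its aud claim matches what your receiving function expects.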