I am trying to set up a Google Cloud Platform connection in Google Cloud Composer using a service account key. I created a GCS bucket and uploaded the service account key file, which is in JSON format, to it. In the Keyfile Path field I specified the GCS bucket path, and in the Keyfile JSON field I specified the file name. The scope is https://www.googleapis.com/auth/cloud-platform.
When I try to use this connection to start a Dataproc cluster, I get an error saying the JSON file cannot be found.
Looking at the error message, the code tries to open the file with:
with open(filename, 'r') as file_obj:
which obviously won't work with a GCS bucket path.
So my question is: where should I put this service account key file, if it cannot be read from a GCS path?
You can store Airflow connections in an external secrets backend such as HashiCorp Vault, AWS SSM Parameter Store, or other similar services.
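As a sketch of what that looks like, a secrets backend is enabled in airflow.cfg (or the equivalent AIRFLOW__SECRETS__* environment variables). The Vault URL and mount paths below are placeholder values, not something from your setup:

```ini
[secrets]
backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
backend_kwargs = {"connections_path": "connections", "url": "http://127.0.0.1:8200"}
```

With this in place, Airflow resolves a connection ID by looking it up in Vault before (or instead of) its own metadata database.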
I'm assuming you want your operators to use a service account distinct from the default auto-generated Compute Engine account that Composer runs under.
The docs indicate that you can add a new Airflow connection for the service account, which involves copy-pasting the entire contents of the JSON key file into the connection config (look for the Keyfile JSON field once you select the Google Cloud Platform connection type).
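Before pasting, it can help to sanity-check that the downloaded key file is valid JSON with the fields a service account key normally carries. This is a hypothetical helper, not part of Airflow; the field names checked are the standard ones Google includes in a service account key:

```python
import json

# Fields normally present in a Google service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(raw: str) -> bool:
    """Return True if raw parses as JSON and has the expected key fields."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return data.get("type") == "service_account" and REQUIRED_FIELDS <= data.keys()

# Example with a fake (non-secret) key structure:
fake_key = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "dag-runner@my-project.iam.gserviceaccount.com",
})
print(looks_like_service_account_key(fake_key))  # → True
print(looks_like_service_account_key("not json"))  # → False
```

If this returns False for a file you downloaded from the console, the file was likely truncated or mangled when copying it.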