FERNET_KEY configuration is missing when creating a new environment with the same DAGS

I'm using Composer (Airflow) in Google Cloud. I want to create a new environment and take my same DAGs and Variables from the old environment into the new one.

To accomplish this I do the following:

  • I check several of my variables and export them to a JSON file.
  • In my new environment I import this same JSON file.
  • I use gsutil and upload my same DAGs to the new environment

However, in the new environment all of my DAGs are breaking with a `FERNET_KEY configuration is missing` error. My best guess is that this is related to importing variables that were encrypted using a different Fernet key, but I'm unsure.

Has anyone encountered this issue before? If so, how did you fix it?
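To illustrate the guess above: Airflow encrypts Variables and Connections with the key in `AIRFLOW__CORE__FERNET_KEY`, using the `cryptography` library's Fernet implementation. A minimal sketch (not Composer-specific; the keys and value here are made up) shows that a value encrypted under one environment's key cannot be decrypted with another's:

```python
from cryptography.fernet import Fernet, InvalidToken

old_key = Fernet.generate_key()  # stands in for the old environment's key
new_key = Fernet.generate_key()  # stands in for the new environment's key

# Encrypt a value the way the old environment would have
token = Fernet(old_key).encrypt(b"my-variable-value")

# The original key round-trips fine...
assert Fernet(old_key).decrypt(token) == b"my-variable-value"

# ...but decrypting with a different key raises InvalidToken
try:
    Fernet(new_key).decrypt(token)
except InvalidToken:
    print("InvalidToken: value was encrypted under a different Fernet key")
```

So a key mismatch is a plausible failure mode, though as the accepted answer below shows, the error here turns out to come from the key being absent entirely.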

asked Feb 19 '20 by bpgeck

2 Answers

I can reliably reproduce the issue in Composer 1.9 / Airflow 1.10.6 by performing the following actions:

  1. Create a new Composer Cluster
  2. Upload a DAG that references an Airflow Connection
  3. Set an Environment Variable in Composer
  4. Wait for airflow-scheduler and airflow-worker to restart

Aside from the `FERNET_KEY configuration is missing` error, the issue manifests itself with the following Airflow error banners:

Broken DAG: [/home/airflow/gcs/dags/MY_DAG.py] invalid literal for int() with base 10: 'XXX'
Broken DAG: [/home/airflow/gcs/dags/MY_DAG.py] Expecting value: line 1 column 1 (char 0)

The root cause of the issue is that adding a new environment variable removes the AIRFLOW__CORE__FERNET_KEY environment variable from the airflow-scheduler and airflow-worker Kubernetes Deployment Spec Pod Templates:

- name: AIRFLOW__CORE__FERNET_KEY
  valueFrom:
    secretKeyRef:
      key: fernet_key
      name: airflow-secrets

As a workaround, it's possible to apply a Kubernetes Deployment Spec Patch:

$ cat config/composer_airflow_scheduler_fernet_key_patch.yaml
spec:
  template:
    spec:
      containers:
      - name: airflow-scheduler
        env:
        - name: AIRFLOW__CORE__FERNET_KEY
          valueFrom:
            secretKeyRef:
              key: fernet_key
              name: airflow-secrets

$ kubectl patch deployment airflow-scheduler --namespace=$AIRFLOW_ENV_GKE_NAMESPACE --patch "$(cat config/composer_airflow_scheduler_fernet_key_patch.yaml)"

NOTE: This patch must also be applied to airflow-worker.
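For completeness, the analogous worker patch would look like the following (a sketch mirroring the scheduler patch above; the file name is illustrative):

```
# config/composer_airflow_worker_fernet_key_patch.yaml (illustrative name)
spec:
  template:
    spec:
      containers:
      - name: airflow-worker
        env:
        - name: AIRFLOW__CORE__FERNET_KEY
          valueFrom:
            secretKeyRef:
              key: fernet_key
              name: airflow-secrets
```

Apply it with the same `kubectl patch` command as above, substituting `airflow-worker` for the deployment name.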

answered Nov 20 '22 by Michael Andrews


We got the same error about the FERNET_KEY. I think there is a bug in the new version (composer-1.9.0). The release notes say 'The Fernet Key is now stored in Kubernetes Secrets instead of the Config Map.' Even if you re-enter your connections, they still don't work.

They have already fixed the issue in version 1.9.1:

https://cloud.google.com/composer/docs/release-notes

answered Nov 20 '22 by zaknafein