Unable to use airflow KubernetesPodOperator on development machine

I am trying to use KubernetesPodOperator for testing on a development iMac (10.15.6). The minikube and kubectl versions are shown below; the Airflow version is apache-airflow[kubernetes]==1.10.11.

I am not able to run any pods using the KubernetesPodOperator. I have two issues:

  1. If I set in_cluster=False, then I get FileNotFoundError: [Errno 2] No such file or directory: '/root/.kube/config' issue
  2. If I set in_cluster=True, then I get kubernetes.config.config_exception.ConfigException: Service host/port is not set.
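The two errors correspond to the two ways the Kubernetes Python client resolves its configuration: in-cluster (from environment variables injected by the kubelet) or from a kubeconfig file. A rough, self-contained sketch of that decision (the function name is mine, not the client's API):

```python
import os

def resolve_kube_config(in_cluster: bool) -> str:
    """Roughly mirror how the kubernetes client picks its configuration."""
    if in_cluster:
        # Inside a pod these env vars are injected automatically; on a
        # development machine they are unset, which is exactly the
        # "Service host/port is not set." error.
        host = os.environ.get("KUBERNETES_SERVICE_HOST")
        port = os.environ.get("KUBERNETES_SERVICE_PORT")
        if not host or not port:
            raise RuntimeError("Service host/port is not set.")
        return f"https://{host}:{port}"
    # Outside the cluster the client falls back to a kubeconfig file; for a
    # scheduler running as root in a container that is /root/.kube/config,
    # hence the FileNotFoundError when nothing is mounted there.
    path = os.path.expanduser("~/.kube/config")
    if not os.path.isfile(path):
        raise FileNotFoundError(path)
    return path
```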

Note that I tried mounting ~/.kube/config as a volume using Secret and Volume objects backed by Kubernetes secrets. This happened with both docker-desktop and minikube.

  1. Can you help me with what I am doing wrong with the mounting?
  2. Can you suggest solutions for the service host/port complaint?
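For the mounting question, the secret has to exist before the DAG can mount it; a sketch of creating it from the local kubeconfig (the secret name matches the DAG below, but the key name and paths are assumptions):

```shell
# Create a secret holding the local kubeconfig. The key name ("config"
# here, an assumption) becomes the filename inside the mounted directory.
kubectl create secret generic local-dev-secrets-kubeconfig \
    --from-file=config=$HOME/.kube/config
```

One caveat: a secret volume always mounts as a directory, so mounting it at the file path /root/.kube/config (as the DAG below does) produces a directory of that name, not the file; mounting at /root/.kube instead avoids that.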
$ kubectl version
Client Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.6-beta.0", GitCommit:"e7f962ba86f4ce7033828210ca3556393c377bcc", GitTreeState:"clean", BuildDate:"2020-01-15T08:26:26Z", GoVersion:"go1.13.5", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.3", GitCommit:"06ad960bfd03b39c8310aaf92d1e7c12ce618213", GitTreeState:"clean", BuildDate:"2020-02-11T18:07:13Z", GoVersion:"go1.13.6", Compiler:"gc", Platform:"linux/amd64"}
$ minikube version
minikube version: v1.12.1
commit: 5664228288552de9f3a446ea4f51c6f29bbdd0e0


from datetime import timedelta
from airflow import DAG

from airflow.operators.dummy_operator import DummyOperator
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.kubernetes.volume import Volume
from airflow.kubernetes.volume_mount import VolumeMount
from airflow.kubernetes.secret import Secret
from airflow.kubernetes.pod import Port

from airflow.utils.dates import days_ago

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=24*60)
}

dag = DAG('kubernetes_sample', 
            default_args=default_args, 
            schedule_interval=timedelta(minutes=24*60))

start = DummyOperator(task_id='START', dag=dag)

secret_volume_config = {
    "secret": {
        "secretName": "local-dev-secrets"
    }
}
kubeconfig_volume_config = {
    "secret": {
        "secretName": "local-dev-secrets-kubeconfig"
    }
}

volumes = [
    Volume(name='local-dev-secrets', configs=secret_volume_config),
    Volume(name='local-dev-secrets-kubeconfig', configs=kubeconfig_volume_config),
]

volume_mounts = [
    VolumeMount('local-dev-secrets', mount_path='/secrets', sub_path=None, read_only=True),
    VolumeMount('local-dev-secrets-kubeconfig', mount_path='/root/.kube/config', sub_path=None, read_only=True),
    VolumeMount('local-dev-secrets-kubeconfig', mount_path='/kubeconfig', sub_path=None, read_only=True)
]

# secret_folder = Secret(deploy_type='volume', deploy_target="/secrets", secret='local-dev-secrets', key=None)
# secret_kubeconfig_file = Secret(deploy_type='volume', deploy_target="/root/.kube/config", secret='local-dev-secrets-kubeconfig', key=None)

ports = [Port('http', 80), Port('https', 443)]

current_task = KubernetesPodOperator(
                        task_id="current-task",
                        name="current-task",
                        namespace='default',
                        dag=dag,
                        image="busybox:latest",
                        image_pull_policy='Always',
                        get_logs=True,
                        do_xcom_push=False, # local development
                        in_cluster=False,
                        is_delete_operator_pod=False,
                        # hostnetwork=True,
                        # ports=[ports],
                        # secrets=[secret_kubeconfig_file],
                        # config_file="/kubeconfig"
                        volumes=volumes,
                        volume_mounts=volume_mounts,
                        # cmds=["ls","-alth", "/"],
                        log_events_on_failure=True
                        )

current_task.set_upstream(start)

asked Sep 21 '25 by bicepjai

1 Answer

Once I mounted the kubeconfig into the Airflow scheduler container as a volume, things worked as expected.
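A sketch of what that mount can look like for a scheduler running in Docker (the image tag and command are placeholders; pair this with in_cluster=False in the operator):

```shell
# Mount the host kubeconfig read-only where the scheduler's root user
# expects it by default.
docker run -d \
    -v "$HOME/.kube/config:/root/.kube/config:ro" \
    apache/airflow:1.10.11 scheduler
```

Note that with minikube the kubeconfig typically references certificate files under ~/.minikube on the host, so those paths may need to be mounted into the container as well.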

answered Sep 23 '25 by bicepjai

