Using Apache Airflow, I created some DAGs, some of which do not run on a schedule.
I'm trying to find a way to trigger a run for a specific DAG from within a Python script. Is this possible? How can I do it?
EDIT --- The Python script will be running from a different project than the project where all my DAGs are located.
In order to create a Python DAG in Airflow, you must always import the required DAG class. Following the DAG class come the operator imports: you must import the corresponding operator for each one you want to use. To execute a Python function, for example, you must import the PythonOperator.
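For instance, a minimal DAG file along those lines might look like the sketch below; the dag_id, dates, and function name are illustrative, and the import paths shown are the Airflow 2.x ones:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # airflow.operators.python_operator in 1.x


def say_hello():
    print("hello from the task")


with DAG(
    dag_id="example_python_dag",      # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,           # no schedule: runs only when triggered
) as dag:
    PythonOperator(
        task_id="say_hello",
        python_callable=say_hello,
    )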
We can use the BranchPythonOperator to define two execution paths, follow the first one during regular operation, and take the other path in case of an error. In the error branch, we can trigger another DAG using the TriggerDagRunOperator.
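A sketch of that pattern, again with illustrative DAG and task names (EmptyOperator was called DummyOperator before Airflow 2.4, and the error-detection logic here is a placeholder):

from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator


def choose_path():
    # Placeholder check; return the task_id of the branch to follow.
    everything_ok = True
    return "normal_path" if everything_ok else "trigger_error_dag"


with DAG(
    dag_id="branching_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
) as dag:
    branch = BranchPythonOperator(task_id="branch", python_callable=choose_path)
    normal_path = EmptyOperator(task_id="normal_path")

    # On the error branch, hand off to another (hypothetical) DAG.
    trigger_error_dag = TriggerDagRunOperator(
        task_id="trigger_error_dag",
        trigger_dag_id="error_handling_dag",
    )

    branch >> [normal_path, trigger_error_dag]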
You have a variety of options when it comes to triggering Airflow DAG runs.
The airflow Python package provides a local client you can use to trigger a DAG from within a Python script. For example:
from airflow.api.client.local_client import Client

# No API base URL or auth is needed for the local client.
c = Client(None, None)
# Queue a run of the given DAG, with an optional run id and configuration dict.
c.trigger_dag(dag_id='test_dag_id', run_id='test_run_id', conf={})
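Note that the local client calls into the Airflow installation on the machine it runs on, so the script needs airflow installed and configured there (with the metadata database reachable). Given your edit that the script lives in a separate project, that only works if both share the same Airflow environment; otherwise the CLI or REST API options below are likely a better fit.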
You can also trigger DAGs manually using the Airflow CLI (airflow dags trigger in Airflow 2.x, airflow trigger_dag in 1.x). More info on how to use the CLI to trigger DAGs can be found in the Airflow CLI documentation.
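To drive the CLI from a script rather than by hand, one option is Python's subprocess module; a minimal sketch, assuming the Airflow 2.x command syntax and that the airflow executable is on the script's PATH:

import subprocess

# Airflow 2.x syntax; in 1.x the equivalent is `airflow trigger_dag test_dag_id`.
subprocess.run(
    ["airflow", "dags", "trigger", "test_dag_id", "--conf", "{}"],
    check=True,  # raise CalledProcessError if the command fails
)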
You can also use the Airflow REST API to trigger DAG runs. More info on that is in the Airflow REST API documentation.
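With the stable REST API in Airflow 2.x, for example, a DAG run is created by POSTing to the dagRuns endpoint. A minimal sketch using requests; the host, the credentials, and the assumption that basic auth is enabled on the webserver are all specific to your deployment:

import requests

# Assumes an Airflow 2.x webserver at localhost:8080 with basic auth enabled;
# the user and password here are hypothetical.
response = requests.post(
    "http://localhost:8080/api/v1/dags/test_dag_id/dagRuns",
    auth=("airflow_user", "airflow_password"),
    json={"dag_run_id": "test_run_id", "conf": {}},
)
response.raise_for_status()
print(response.json())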
The first option, from within Python, might work best for you (it's also how I've personally done it in the past). But you could also use subprocess to drive the CLI from Python, or a library like requests to call the REST API, as sketched above.