Say you have an Airflow DAG that doesn't make sense to backfill: once it has run, running it again in quick succession would be pointless.
For example, if you're loading data into your database from a source that is only updated hourly, backfilling, which fires runs in rapid succession, would just import the same data again and again.
This is especially annoying when you instantiate a new hourly DAG and it runs N times, once for each hour it missed, doing redundant work before it settles into the schedule you specified.
The only solution I can think of is something the docs specifically advise against in the FAQ:
"We recommend against using dynamic values as start_date, especially datetime.now(), as it can be quite confusing."
Is there any way to disable backfilling for a DAG, or should I do the above?
Note that if the DAG is currently running, the Airflow scheduler will restart any tasks you delete. So either stop the DAG first by changing its state, or stop the scheduler (if you are running in a test environment).
The scheduler, by default, will kick off a DAG Run for any data interval that has not been run since the last data interval (or has been cleared). This concept is called Catchup.
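The catchup rule can be sketched in plain Python: every completed data interval between the DAG's start_date and now gets its own run. This is a rough illustration of the scheduling logic, not Airflow's actual implementation, and the function name is made up for this example.

```python
from datetime import datetime, timedelta

def missed_intervals(start_date, now, interval=timedelta(hours=1)):
    """Return the data intervals the scheduler would catch up on:
    every fully elapsed interval between start_date and now."""
    runs = []
    point = start_date
    while point + interval <= now:
        runs.append((point, point + interval))
        point += interval
    return runs

# A DAG deployed 3 hours after its start_date has 3 missed hourly
# intervals, so the scheduler kicks off 3 catchup runs.
runs = missed_intervals(datetime(2017, 1, 1, 0), datetime(2017, 1, 1, 3))
print(len(runs))  # 3
```

This is exactly why a freshly deployed hourly DAG fires N times before settling into its schedule.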
Backfilling in Airflow addresses a different case: you have a DAG already deployed and running, and realize you want to use that DAG to process data prior to the DAG's start date. Backfilling is the concept of running a DAG for a specified historical period.
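When you do want a historical period processed, you trigger it explicitly from the CLI rather than relying on catchup. A minimal sketch (the DAG id here is a placeholder):

```shell
# Run my_dag_id for the week of 2017-01-01, one run per scheduled interval
airflow backfill -s 2017-01-01 -e 2017-01-07 my_dag_id
```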
Upgrade to Airflow 1.8 and set catchup_by_default = False in airflow.cfg, or pass catchup=False to each of your DAGs.
https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#catchup_by_default