I installed Airflow 2.0 using Docker Swarm with the Celery Executor.
After about a week, the Celery workers' memory is exhausted by airflow task supervisor processes (screenshot attached).
Has anyone faced this issue? Any suggestions?

In Airflow 2.0, there are two ways of creating child processes for task execution:

1. Forking the parent process.
2. Spawning a new Python interpreter as a subprocess.

By default, Airflow 2.0 uses method (1). Forking the parent process is faster, but the forked child process is not cleaned up after the task completes, so the number of child processes keeps growing until memory is exhausted.
I switched to the subprocess method (2) by setting execute_tasks_new_python_interpreter = True. With this setting, each task runs in a fresh Python process that exits when the task finishes, and a new process is created every time. This can be slower, but memory is used effectively.
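For reference, a minimal sketch of that change, assuming the key lives under the [core] section of airflow.cfg (as in Airflow 2.0+) and that your Swarm stack passes settings to the worker services via environment variables:

    # airflow.cfg -- switch task execution from forking to a fresh interpreter
    [core]
    execute_tasks_new_python_interpreter = True

    # Equivalent for a Docker Swarm / docker-compose stack: set this env var
    # on the worker (and scheduler) services instead of editing airflow.cfg.
    #   AIRFLOW__CORE__EXECUTE_TASKS_NEW_PYTHON_INTERPRETER=True

After changing the setting, restart the worker services so the new value takes effect.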