
Python tasks and DAGs with different conda environments

Suppose most of my DAGs and tasks in Airflow are supposed to run Python code on the same machine as the Airflow server.

Can I have different DAGs use different conda environments? If so, how should I do it? For example, can I use the PythonOperator for that? Or would that restrict me to using the same conda environment that I used to install Airflow?

More generally, where/how should I ideally activate the desired conda environment for each DAG or task?

asked Oct 14 '18 by Amelio Vazquez-Reina
1 Answer

The Python interpreter that runs the Airflow worker is the one whose environment is used to execute your task code. So with a plain PythonOperator, your callable runs in the same conda environment that Airflow itself was installed into.
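A quick way to see this: whatever callable you hand to a PythonOperator executes inside the worker's own interpreter. The snippet below (plain Python, no Airflow required) mimics that by reporting `sys.executable`; on a real worker this path would point into the conda environment Airflow was installed in.

```python
import sys

def which_python():
    """Stand-in for a PythonOperator callable: report which interpreter
    is actually executing the task code."""
    return sys.executable

# When an Airflow worker invokes the callable, this path is the worker's
# own Python -- i.e. the conda env that Airflow was installed into.
print(which_python())
```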

What you can do is define separate named queues for different workers, each worker started from its own execution environment, so that a given DAG's tasks are only executed by a specific machine or group of machines.
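With the CeleryExecutor, every operator accepts a `queue` argument, and each worker can be started listening only to chosen queues. A minimal sketch of launching two workers from different conda environments (the env names `ml_env`/`etl_env` and queue names `ml`/`etl` are hypothetical):

```shell
# Worker A: started from a conda env that has the ML dependencies,
# listening only on the "ml" queue.
conda activate ml_env
airflow celery worker --queues ml

# Worker B: started from a different conda env for ETL jobs,
# listening only on the "etl" queue.
conda activate etl_env
airflow celery worker --queues etl
```

In the DAG file you then pin each task to the matching queue, e.g. `PythonOperator(task_id="train", python_callable=train, queue="ml")`; only workers subscribed to `ml` (and therefore running the `ml_env` interpreter) will pick that task up.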

answered Nov 15 '22 by Meghdeep Ray