Say that most of my DAGs and tasks in Airflow are supposed to run Python code on the same machine as the Airflow server.
Can I have different DAGs use different conda environments? If so, how should I do it? For example, can I use the PythonOperator for that, or would that restrict me to the same conda environment that I used to install Airflow?
More generally, where/how should I ideally activate the desired conda environment for each DAG or task?
The Python interpreter that runs the Airflow worker process is the one whose environment will be used to execute your task code.
What you can do is define separate named queues for the different execution environments: start each worker from inside its own conda environment and have it listen on its own queue, so that only a specific machine or group of machines will execute a certain DAG or task, as sketched below.
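Here is a minimal sketch of the queue-routing approach, assuming Airflow 2.4+ with the CeleryExecutor. The queue name `conda_env_a` and the DAG/task ids are hypothetical; use whatever names map onto your environments:

```python
# A minimal sketch, assuming Airflow 2.4+ with the CeleryExecutor.
# The queue name "conda_env_a" is hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def report_interpreter():
    # Runs with whatever interpreter the worker process itself uses,
    # i.e. the conda environment the worker was started from.
    import sys
    print(sys.executable)


with DAG(
    dag_id="env_a_dag",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    PythonOperator(
        task_id="report_interpreter",
        python_callable=report_interpreter,
        queue="conda_env_a",  # only workers listening on this queue pick it up
    )
```

A worker for that queue would then be started from inside the matching conda environment, e.g. `conda activate env_a` followed by `airflow celery worker --queues conda_env_a`, while workers started from other environments listen on other queues.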