I'm trying to import a local module (a python script) to my DAG.
Directory structure:
airflow/
├── dag
│   ├── __init__.py
│   └── my_DAG.py
└── script
    └── subfolder
        ├── __init__.py
        └── local_module.py
Sample code in my_DAG.py:
# trying to import from local module
from script.subfolder import local_module

# calling a function in local_module.py
a = local_module.some_function()
I get an error in Airflow saying "Broken DAG: my_DAG. No module named 'local_module'".
I've updated Airflow to 1.9.0 but this doesn't fix the issue.
Thanks.
You can do it in one of these ways:
- add your modules to one of the folders that Airflow automatically adds to PYTHONPATH
- add extra folders where you keep your code to PYTHONPATH (a sketch follows below)
- package your code into a Python package and install it together with Airflow
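For the second option, here is a minimal sketch, assuming the directory layout from the question (my_DAG.py living in airflow/dag/): it appends the project root to sys.path at the top of the DAG file so that script.subfolder becomes importable.

# my_DAG.py -- sketch of option 2; paths are assumptions based on the question's layout
import os
import sys

# airflow/dag/my_DAG.py -> airflow/
PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if PROJECT_ROOT not in sys.path:
    sys.path.append(PROJECT_ROOT)

from script.subfolder import local_module

a = local_module.some_function()

Alternatively, set PYTHONPATH in the environment that runs the scheduler and webserver instead of touching sys.path inside the DAG file.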
The __init__.py file makes Python treat the directory containing it as a package. It is also the first file executed when the package is imported, so you can use it to run code each time the package is loaded, or to specify which submodules are exported.
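For example, a hypothetical script/subfolder/__init__.py could re-export a function so callers can import it directly from the package (some_function is assumed from the question's code):

# script/subfolder/__init__.py
# Hypothetical: re-export some_function so callers can write
#     from script.subfolder import some_function
from .local_module import some_function

__all__ = ["some_function"]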
This usually has to do with how Airflow is configured.
In airflow.cfg, make sure the path in airflow_home is correctly set to the path where the Airflow directory structure lives. Airflow then scans the subfolders under that path and adds them to the Python path so the modules inside can be found.
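As a sketch, the relevant entries in airflow.cfg look like this; the paths are examples, substitute the ones from your setup:

[core]
# Example values; adjust to where your airflow/ directory actually lives
airflow_home = /home/user/airflow
dags_folder = /home/user/airflow/dag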
Otherwise, just make sure the folder you are trying to import from is on the Python path: How to use PYTHONPATH