I deployed a DAG in Airflow (on GCP) but I get the error "No module named 'scipy'". How do I install packages in Airflow?
I've tried adding a separate DAG to run

    import logging
    import subprocess
    import sys

    def pip_install(package):
        subprocess.call([sys.executable, "-m", "pip", "install", package])

    def update_packages(**kwargs):
        logging.info(list(sys.modules.keys()))
        for package in PACKAGES:  # PACKAGES: list of package names to install
            pip_install(package)
I've tried running pip3 install scipy in the GCP shell, and I've tried adding pip install scipy to the image builder. Neither approach worked.
If you are using Cloud Composer on GCP, check https://cloud.google.com/composer/docs/how-to/using/installing-python-dependencies
Pass a requirements.txt file to the gcloud command-line tool. Format the file with each requirement specifier on a separate line.
Sample requirements.txt file:
scipy>=0.13.3
scikit-learn
nltk[machine_learning]
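The specifier format used above (a distribution name, optional extras in brackets, an optional version constraint) can be sketched with a small stdlib parser. This is only an illustration of the format, not how pip or Composer actually parses requirements files:

```python
import re

# Minimal sketch: name, optional [extras], optional version constraint.
# Real tooling (pip, packaging.requirements) handles many more cases.
SPEC_RE = re.compile(
    r"^(?P<name>[A-Za-z0-9_.-]+)"     # distribution name, e.g. scikit-learn
    r"(?:\[(?P<extras>[^\]]+)\])?"    # optional extras, e.g. [machine_learning]
    r"(?P<constraint>[<>=!~].*)?$"    # optional constraint, e.g. >=0.13.3
)

def parse_requirement(line):
    m = SPEC_RE.match(line.strip())
    if not m:
        raise ValueError(f"unparseable requirement: {line!r}")
    extras = m.group("extras")
    return {
        "name": m.group("name"),
        "extras": extras.split(",") if extras else [],
        "constraint": m.group("constraint") or "",
    }

for line in ["scipy>=0.13.3", "scikit-learn", "nltk[machine_learning]"]:
    print(parse_requirement(line))
```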
Pass the requirements.txt file to the gcloud command to set your installation dependencies:
    gcloud composer environments update ENVIRONMENT-NAME \
        --update-pypi-packages-from-file requirements.txt \
        --location LOCATION
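Once the environment update finishes, you can confirm from worker Python (for example, inside a PythonOperator callable) that the package is actually installed. `check_package` below is a hypothetical helper, not part of Airflow or Composer, and it assumes Python 3.8+ for importlib.metadata:

```python
import importlib
from importlib import metadata  # Python 3.8+

def check_package(dist_name, module_name=None):
    """Return the installed version of dist_name, raising if it is missing.

    Raises importlib.metadata.PackageNotFoundError if the distribution
    is not installed, and ImportError if the module cannot be imported.
    """
    version = metadata.version(dist_name)
    importlib.import_module(module_name or dist_name)  # confirm it imports
    return version

# Example use inside a DAG task: check_package("scipy")
```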