As the title says, we can list PyPI packages in a requirements.txt file and use the command
gcloud beta composer environments update env_name --update-pypi-packages-from-file requirements.txt --location location
to update the Cloud Composer environment.
But does it support installing a custom GitHub repo via requirements.txt? I've tried adding a line like:
pkg_name @ git+ssh://git@github.com/my_account/pkg_repo.git#master
and it doesn't work.
Thanks!
Update: I have a workaround, which is to put the library into the plugins folder. But I think the best strategy in our case would be to install the package from GitHub.
No, it does not support installing custom libraries this way. In the docs here you can see that --update-pypi-packages-from-file
is intended to be used only with libraries that are in the Python Package Index:
You can install Python dependencies from the Python Package Index through the Google Cloud Platform Console or by making a partial update request to the environment using the gcloud command-line tool.
You have three alternative options:
Install a local Python library.
Use the plugins feature.
Use the KubernetesPodOperator.
Installing a local Python library is quite straightforward:
1. In your Composer GCS bucket, create a dependencies folder inside the dags folder and add your library modules there (don't forget to add the necessary __init__.py file(s)).
2. Then simply do something like:
from dependencies import your_module
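To illustrate the local-library approach, here is a minimal DAG sketch; the module name your_module and its function do_work are hypothetical placeholders, and the DAG file is assumed to sit in the dags folder next to the dependencies folder:

from datetime import datetime
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
# "dependencies" is the folder you created inside the dags folder of the Composer bucket;
# your_module and do_work are placeholder names.
from dependencies import your_module

with DAG(dag_id="local_dependency_example",
         start_date=datetime(2020, 1, 1),
         schedule_interval=None) as dag:
    use_module = PythonOperator(
        task_id="use_local_module",
        python_callable=your_module.do_work,
    )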
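And if the package really must come from GitHub, the KubernetesPodOperator option lets you build the dependency into a container image and run it in a pod. A rough sketch follows; the image name and the command are assumptions, and the image would be built separately with the GitHub package pre-installed (e.g. via pip in its Dockerfile):

from datetime import datetime
from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

with DAG(dag_id="github_pkg_in_pod_example",
         start_date=datetime(2020, 1, 1),
         schedule_interval=None) as dag:
    run_pkg = KubernetesPodOperator(
        task_id="run_github_pkg",
        name="run-github-pkg",
        namespace="default",
        # Hypothetical image that already has the GitHub package installed.
        image="gcr.io/my_project/pkg_repo_image:latest",
        cmds=["python", "-c", "import pkg_name; print(pkg_name)"],
    )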