I have a fairly large private python package I just finished creating. I'd like to install it as part of my build process for an app in a Docker container (though this isn't so important). The package source is quite large, so ideally I'd avoid downloading/keeping the whole source.
Right now, I've been just passing around the package source along with my app, but this is unwieldy and hopefully temporary. What's a better way? git submodule/subtree? I'm pretty new to this.
Inside that package directory, alongside your Python files, create a file called `__init__.py`. This file can be empty; it marks the directory as a Python package. When you `pip install` the package, this directory will be installed and become importable.
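As a minimal sketch of what this means in practice (the name `mypackage` is a placeholder), an empty `__init__.py` is enough to make a directory importable:

```python
import os
import sys
import tempfile

# Build a throwaway package directory: mypackage/__init__.py
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "mypackage")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()  # empty marker file

# Make the parent directory visible to the import system
sys.path.insert(0, tmp)
import mypackage  # works because __init__.py marks it as a package
```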
When you have a `requirements.txt` file and you want to run `pip install -r requirements.txt` in your Dockerfile, what I do is pass a `GITHUB_OAUTH_TOKEN` env var (put it in your `.env` or `docker-compose.yml` file), then add to the Dockerfile:

```
RUN git config --global url."https://${GITHUB_OAUTH_TOKEN}@github.com/".insteadOf "https://github.com/"
```

That rewrites all `github.com` URLs coming from `requirements.txt` so the clones authenticate with the token.
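Put together, a minimal Dockerfile sketch might look like this (the base image, the `git` install step, and passing the token via `--build-arg` are my assumptions, not part of the original answer):

```dockerfile
FROM python:3.11-slim

# Supplied at build time: docker build --build-arg GITHUB_OAUTH_TOKEN=... .
ARG GITHUB_OAUTH_TOKEN

# pip needs git available to install git+https:// requirements
RUN apt-get update && apt-get install -y --no-install-recommends git

# Rewrite github.com URLs so pip's git clones authenticate with the token
RUN git config --global url."https://${GITHUB_OAUTH_TOKEN}@github.com/".insteadOf "https://github.com/"

COPY requirements.txt .
RUN pip install -r requirements.txt
```

Be aware that build args remain visible in the image history, so for production builds BuildKit secret mounts are a safer way to pass the token.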
If you don't want to mess with SSH keys, use HTTPS and token authentication. Since you don't want to expose your token in your repo, I suggest passing the token as an environment variable.
Add to your Dockerfile (`envsubst` comes from the gettext package; `GITOKEN` is supplied at build time with `--build-arg`):

```
ARG GITOKEN
ENV GITOKEN="$GITOKEN"
RUN bash -c "pip install -r <(envsubst < requirements.txt)"
```
Set your `requirements.txt` like this:

```
plotly==2.7.0
git+https://${GITOKEN}@github.com/githubuser/your_package.git@master
```