My production server has no internet access, so it's a bit of a mess copying all the dependencies from my dev machine over to it.
If I used virtualenv, I'd have all my dependencies inside that environment, and I could deploy the project to any machine that has Python and virtualenv installed.
But I've rarely seen this done, and it seems kind of dirty. Am I wrong and is this actually good practice, or are there other ways to solve this nicely?
A virtual environment helps keep your project bundled together with a list of its dependencies. This makes it portable and easy for someone else to open your project and get it up and running without dozens of import errors.
A virtual environment should be used whenever you work on a Python-based project. It's generally good to create a new virtual environment for each project, so that every project's dependencies are isolated from the system and from each other.
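For example, a typical per-project workflow might look like this (the package name is just a placeholder):

$ virtualenv venv                        # create an isolated environment for this project
$ source venv/bin/activate               # activate it (venv\Scripts\activate on Windows)
(venv) $ pip install requests            # dependencies are installed into the venv only
(venv) $ pip freeze > requirements.txt   # record exact versions so the env can be recreated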
Three options I would consider:
Run your own PyPI mirror with the dependencies you need. You really only need to build the expected file layout and pull from your local server using the --index-url flag (a lighter-weight variant of this idea is sketched after the list):
$ pip install --index-url http://pypi.beastcraft.net/ numpy
Build virtualenvs on the same architecture and copy those over as needed.
This works, but you're taking a risk on true portability.
Use terrarium to build virtual environments, then bring those over (basically option 2, but with easier bookkeeping/automation).
I've done all of these and actually think that hosting your own PyPI mirror is the best option. It gives you the most flexibility when you're making a deployment or trying out new code.
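If running a full mirror is more than you need, a lighter-weight variant of option 1 is to pre-download the packages on a machine with internet access and install from a local directory on the offline server. Roughly (the requirements file and paths are placeholders, and the downloaded wheels must match the production machine's platform and Python version):

# On the dev machine (with internet access): fetch wheels/sdists for all dependencies
$ pip download -r requirements.txt -d ./packages
# Copy ./packages to the production server, then install without hitting the network
$ pip install --no-index --find-links ./packages -r requirements.txt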