In my workplace, I have to manage many Python web applications (currently tens, but probably hundreds eventually), each potentially running a different mix of frameworks and libraries, all at various versions. Virtualenv has been a lifesaver in managing that so far, but I'd still like to manage things better, particularly when it comes to package upgrades.
I've thought of a few scenarios:
Option 1: Install all required modules for each project in its own virtualenv using pip, and upgrade each individually as necessary. This carries a significant time cost per upgrade and requires extra documentation to keep track of what is installed where, though it could be eased by some management scripting (see the sketch after this list).
Option 2: Install every library used by any application in one central location and use symlinks to change versions once for all projects. This gives easy upgrades and central management, but forgoes some of the biggest benefits of using virtualenv in the first place.
Option 3: Hybridize the two approaches: centralize the most common libraries and/or those most likely to need upgrades, and install the rest locally in each virtualenv.
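For Option 1, here's a minimal sketch of what such a management script might look like. The /srv/venvs layout and the upgrade_everywhere helper are assumptions for illustration, not part of any existing tool:

    #!/usr/bin/env python
    """Hypothetical helper: upgrade one package in every virtualenv under a root."""
    import subprocess
    import sys
    from pathlib import Path

    VENV_ROOT = Path("/srv/venvs")  # assumption: one virtualenv per project lives here


    def upgrade_everywhere(package):
        """Run each virtualenv's own pip so the upgrade stays isolated per project."""
        for venv in sorted(VENV_ROOT.iterdir()):
            pip = venv / "bin" / "pip"
            if not pip.exists():
                continue  # not a virtualenv; skip it
            print("Upgrading %s in %s" % (package, venv.name))
            subprocess.check_call([str(pip), "install", "--upgrade", package])


    if __name__ == "__main__":
        if len(sys.argv) != 2:
            sys.exit("usage: upgrade_all.py PACKAGE")
        upgrade_everywhere(sys.argv[1])

Targeting one package per run keeps upgrades incremental; the same loop could just as easily point each environment's pip at a per-project requirements file.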
Does anyone else have a similar situation? What's the best way to handle this?
You might consider using zc.buildout. It's more annoying to set up than plain pip/virtualenv, but it gives you more opportunities for automation, since a project's environment is described in a single buildout.cfg. If disk space isn't an issue, I'd suggest you keep using individual environments for each project so you can upgrade them one at a time.
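To give a feel for it, here's a minimal sketch of a buildout.cfg; the part name and the pinned packages and versions are made up for illustration:

    [buildout]
    parts = app
    versions = versions

    [app]
    recipe = zc.recipe.egg
    eggs =
        Flask
        SQLAlchemy

    [versions]
    # Pinning versions here is what makes upgrades reproducible:
    # bump a number, rerun bin/buildout, and the environment follows.
    Flask = 1.1.4
    SQLAlchemy = 1.3.24

Checking that file into each project's repository also gives you, for free, the documentation that Option 1 would otherwise require you to maintain by hand.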