Coming from JavaScript I'm familiar with NPM.
There you can install packages globally (using the -g flag) or locally in a project. In Python there are these virtual environments. I'm still a bit uncertain why they are needed. I know they are for having the same package in different versions on one machine.
Is it because Python doesn't have the concept of local project-installations?
All package installations are global and there's no way around that, so they have virtual environments instead?
Am I right there?
Virtual environments make it possible for you to encapsulate dependencies per project.
Python has no node_modules equivalent. When you install something with pip, it goes into your site-packages folder. To find out where that folder is, you can run python -m site, which prints the folders where Python searches for packages.
Example on Fedora 29:
➜ ~ python -m site
sys.path = [
'/home/geckos',
'/usr/lib/python27.zip',
'/usr/lib64/python2.7',
'/usr/lib64/python2.7/plat-linux2',
'/usr/lib64/python2.7/lib-tk',
'/usr/lib64/python2.7/lib-old',
'/usr/lib64/python2.7/lib-dynload',
'/usr/lib64/python2.7/site-packages',
'/usr/lib/python2.7/site-packages',
]
USER_BASE: '/home/geckos/.local' (exists)
USER_SITE: '/home/geckos/.local/lib/python2.7/site-packages' (doesn't exist)
ENABLE_USER_SITE: True
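You can also query these locations programmatically from the standard library; a quick sketch:

```python
import site
import sys

# sys.prefix is the root of the active Python installation
# (or of the virtual environment, if one is active).
print(sys.prefix)

# The per-user site-packages directory (USER_SITE in the output above).
print(site.getusersitepackages())
```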
If you don't use virtual environments, you may end up with packages installed side by side with the operating system's Python packages, and this is where the danger is: packages can overwrite each other and things get messy fast. For example, you install Flask with pip and then install Jinja2 with the system package manager; later you remove Jinja2 and break Flask. Or you update your system: Jinja2 gets updated but Flask doesn't. Or, even simpler, you install something with the package manager and remove it with pip, and now the package manager is in a broken state.
Because of this we always use virtual environments, and even a separate virtual environment per project.
Nothing prevents you from keeping your virtual environment in the same folder as your project. That way you get the same feeling you have with node_modules. You can create it with
virtualenv <SOME_FOLDER>
for Python 2, or
python3 -m venv <SOME_FOLDER>
for Python 3.
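A typical Python 3 workflow looks like this (the folder name venv is just a convention, and the activate path differs on Windows):

```shell
python3 -m venv venv        # create ./venv with its own interpreter and site-packages
. venv/bin/activate         # put venv/bin first on PATH (Windows: venv\Scripts\activate)
python -c 'import sys; print(sys.prefix)'   # prints the venv path, not the system prefix
# pip install <package>     # would now install into venv, not globally
deactivate                  # restore the previous shell environment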
If you're keeping the virtual environment as a subfolder of your project, I usually call it env or venv. Another option is keeping them all in one folder inside your home; I've been using ~/.venv/<PROJECT>.
Finally, there is an alternative that I like more than plain pip. Pipenv is a tool that manages virtual environments automatically for you. It feels closer to yarn and has more features.
To create a virtual environment for a project, just run pipenv --three (or pipenv --two) in your project folder. It will create and manage the virtual environment and write dependencies to a Pipfile. It also supports development packages; I really think it's worth trying. Here are the docs: https://pipenv.kennethreitz.org/en/latest/
I hope this helps, regards
Is it because Python doesn't have the concept of local project-installations?
Correct.
Well, mostly correct. There are a number of "modern" Python package managers that support project-local package installation. Right now the big two are pipenv and poetry.
However, all of these libraries are fundamentally wrappers over the basic Python virtual environment mechanism. It's the basis of the ecosystem.
Global package management is a little thorny in Python because Unix systems tend to come with a "system Python" installation that supports parts of the operating system. Installing/updating packages in the system Python is a very bad idea, so you always want to be working in a Python you installed yourself, either a fully separate installation or at least a virtual environment of the system Python.
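One way to check which Python you're in: inside a virtual environment, sys.prefix differs from sys.base_prefix (a sketch; sys.base_prefix exists on Python 3.3+, so older interpreters fall back to "not a venv"):

```python
import sys

def in_virtualenv() -> bool:
    # venv rewrites sys.prefix to point at the environment's folder,
    # while sys.base_prefix keeps pointing at the original installation.
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print(in_virtualenv())
```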