I have more than one Python environment configured on my Debian system. Is there a way to list all configured environments on Linux?
This is different from the possible duplicate indicated in the comment below: I mean virtual environments created using virtualenv only.
If your environments are managed by conda, you can list them with the conda env list command. It prints the name as well as the filesystem path of each of your virtual environments.
You can use lsvirtualenv (from virtualenvwrapper), which has two modes, "long" and "brief". The "long" mode is the default: it also searches for any hooks you may have configured around this command and executes them, which takes more time. The "brief" mode just prints the virtualenv names.
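As a sketch of the two modes (assuming virtualenvwrapper is installed and sourced in the current shell):

```shell
# Brief mode: just print the environment names, one per line.
lsvirtualenv -b

# Long mode (the default): prints each name with a separator and runs
# any configured get_env_details hooks, which is why it is slower.
lsvirtualenv -l
```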
virtualenv-clone will be installed inside newenv. Now, with newenv activated, we can create a copy of any existing environment. For example, to create a copy of ProjectAenv:
(newenv): virtualenv-clone ProjectAenv ProjectBenv
(newenv): deactivate  # to come out of newenv
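A minimal transcript of the steps described above, assuming newenv is already activated and ProjectAenv exists (all environment names come from the answer):

```shell
# Run inside the activated newenv:
pip install virtualenv-clone               # install the cloning tool into newenv
virtualenv-clone ProjectAenv ProjectBenv   # copy ProjectAenv to a new ProjectBenv
deactivate                                 # come out of newenv
```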
If only using the lowly virtualenv {directory} to create a virtualenv, then there is just some directory somewhere that contains that specific environment. You can only "list" these by running find on your $HOME directory (or any other directories you might have used to create virtualenvs), looking for Python installations. Hopefully some convention was followed, like storing them all in ~/virtualenvs. (See also: Where should virtualenvs be created?)
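One way to sketch that find-based search (assuming your environments live somewhere under $HOME): every virtualenv contains a bin/activate script, and venv plus recent virtualenv versions also place a pyvenv.cfg file at the environment root, so searching for either marker turns up the environment directories:

```shell
# Print the root directory of every environment found under $HOME.
# Marker 1: pyvenv.cfg sits at the top of the environment directory.
find "$HOME" -name "pyvenv.cfg" -exec dirname {} \; 2>/dev/null

# Marker 2: bin/activate sits one level down, so strip two path components.
find "$HOME" -path "*/bin/activate" \
    -exec sh -c 'dirname "$(dirname "$1")"' _ {} \; 2>/dev/null
```

The second form also catches older virtualenv layouts that lack pyvenv.cfg; expect duplicates where both markers exist.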
If using virtualenvwrapper, then, as mentioned, use the lsvirtualenv command to list the environments that were created with mkvirtualenv. They are all in ~/.virtualenvs by default. See https://virtualenvwrapper.readthedocs.io/en/latest/command_ref.html
If using conda, you can list the virtual environments created via conda create --name {my_env} [...] using either conda info --envs or conda env list. See https://conda.io/docs/using/envs.html#list-all-environments
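Both commands produce the same listing (a sketch assuming conda is on your PATH; your environment names will differ):

```shell
# Either command prints one row per environment: its name and its path,
# with an asterisk marking the currently active environment.
conda info --envs
conda env list
```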