I quite often find that this would be useful, but I'm not sure there's any way to do it. I often work on a Python project where I start with a virtual environment and a Jupyter notebook, adding libraries to the virtual environment as I experiment in the notebook. The problem is that if I run
pip freeze > requirements.txt
at the end of my project, that file will include the Jupyter libraries in my virtual environment. Is there some way I can run a Jupyter notebook (e.g. from my base conda environment) but use a kernel associated with another virtual environment? This seems like the nicest solution, but I'm not sure if it's possible. I know I could probably remove those entries from the frozen output, but that seems like a hack. I can't see any way to avoid installing at least ipykernel in the target virtual environment.
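A common approach (a sketch, with example paths — `./myenv` is a placeholder for your project's environment) is to install only ipykernel into the target environment and register it as a named kernel with the Jupyter that lives in your base environment:

```shell
# Create the project's virtual environment (path is an example)
python3 -m venv ./myenv

# ipykernel is the one Jupyter-related package the target
# environment does need, so the kernel can run inside it
./myenv/bin/pip install ipykernel

# Register the environment as a named kernel; Jupyter in the
# base environment will then list it in the kernel picker
./myenv/bin/python -m ipykernel install --user --name=myenv
```

After this, `pip freeze` run with `./myenv/bin/pip` will still show ipykernel and its dependencies, but none of the notebook server packages, since those stay in the base environment.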
To use your new environment with Jupyter Notebooks, open the Notebook application and click the New button to open a new notebook. The environment you just created is displayed in the drop-down menu under Notebooks; select it to activate it.
To set an environment variable in a Jupyter notebook, use the % magic commands %env or %set_env, e.g. %env MY_VAR=MY_VALUE or %env MY_VAR MY_VALUE. (Use %env by itself to print the current environment variables.)
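The %env magic is just a convenience over the process environment; the plain-Python equivalent (usable in any cell or script) is `os.environ`:

```python
import os

# Equivalent of "%env MY_VAR=MY_VALUE" — set a variable for this process
os.environ["MY_VAR"] = "MY_VALUE"

# Equivalent of "%env MY_VAR" — read it back
print(os.environ["MY_VAR"])
```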
Make a new directory under the Jupyter kernels directory. If you do not know where to find it, check the kernel documentation (https://jupyter-client.readthedocs.io/en/stable/kernels.html).
Create a kernel.json file in that directory with the following contents:
{
"argv": [ "/path-to-env/myenv/bin/python", "-m", "ipykernel",
"-f", "{connection_file}"],
"display_name": "myenv",
"language": "python"
}
Then run
jupyter notebook
and you should see a kernel that uses your virtual environment. Here's a blog post that explains it in more detail: https://www.alfredo.motta.name/create-isolated-jupyter-ipython-kernels-with-pyenv-and-virtualenv/
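The manual steps above can also be scripted. Here's a sketch that writes the kernelspec shown above — the kernels directory and the `/path-to-env/myenv` interpreter path are example assumptions; on Linux the per-user location is typically ~/.local/share/jupyter/kernels/:

```python
import json
import os

# Example per-user kernels directory (an assumption; run
# "jupyter --paths" to see the directories on your system)
kernel_dir = os.path.expanduser("~/.local/share/jupyter/kernels/myenv")
os.makedirs(kernel_dir, exist_ok=True)

# Same kernelspec as the kernel.json shown above;
# /path-to-env/myenv is a placeholder for your environment
spec = {
    "argv": ["/path-to-env/myenv/bin/python", "-m", "ipykernel",
             "-f", "{connection_file}"],
    "display_name": "myenv",
    "language": "python",
}

with open(os.path.join(kernel_dir, "kernel.json"), "w") as f:
    json.dump(spec, f, indent=2)
```

Note that `python -m ipykernel install` (shown in other answers) writes essentially this same file for you, so scripting it by hand is mainly useful when you need a nonstandard layout.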