I have recently started using Jupyter Lab, and my problem is that I work with quite large datasets (usually the dataset itself is approx. a quarter of my computer's RAM). After a few transformations, saved as new Python objects, I tend to run out of memory. The issue is that when I approach the available RAM limit and perform any operation that needs more RAM, my computer freezes, and the only way to fix it is to restart it. Is this the default behaviour in Jupyter Lab/Notebook, or is there some setting I should change? Normally I would expect the program to crash (as in RStudio, for example), not the whole computer.
To restart the kernel, press Esc to enter command mode, then press 0 0 (zero twice) to restart the kernel; your program should then run again. To avoid this in the future, let your program finish executing before editing the code.
If Jupyter doesn't load or doesn't work in the browser, try another browser (e.g. if you normally use Firefox, try Chrome); this helps pin down where the problem is. Also try disabling any browser extensions and/or any Jupyter extensions you have installed. Some internet security software can interfere with Jupyter as well.
If you load a file in a Jupyter notebook and store its contents in a variable, the underlying Python process will keep the memory for this data allocated as long as the variable exists and the notebook is running. Python's garbage collector will free the memory again (in most cases) once it detects that the data is no longer needed.
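For example (a minimal sketch; the use of pandas and the file name large_dataset.csv are assumptions, not part of the original answer), dropping the last reference to a large object frees its memory without restarting the kernel:

import gc

import pandas as pd

# Hypothetical large file: while any variable references the DataFrame,
# the Python process keeps its memory allocated.
df = pd.read_csv("large_dataset.csv")

summary = df.describe()  # keep only the small derived result

# Removing the only reference lets CPython's reference counting free the
# object immediately; gc.collect() additionally clears reference cycles.
del df
gc.collect()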
Loading too much data into a Jupyter notebook during a session (e.g. by printing 5000 rows in your notebook) can cause that notebook to refuse to open and make all other notebooks you are connected to unresponsive as well. If you can't even right-click in your notebook, congrats, you just did it :D.
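A cheap way to avoid this, assuming the data sits in a pandas DataFrame (an assumption; the answer names no library), is to render only a slice, or to cap how many rows pandas will draw into a cell's output:

import pandas as pd

df = pd.read_csv("large_dataset.csv")  # hypothetical file

# Show a small slice instead of rendering the whole frame ...
df.head(20)

# ... and cap how many rows pandas will ever render in cell output, so an
# accidental bare `df` on the last line of a cell stays cheap to display.
pd.set_option("display.max_rows", 50)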
Another way is to run the Jupyter notebook from the CLI directly. This keeps all output logged in the notebook file throughout execution. There are two choices of program for this purpose, runipy or nbconvert; either can be installed with pip or conda.
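For example, the nbconvert route is pip install nbconvert followed by jupyter nbconvert --to notebook --execute notebook.ipynb. The same run can also be driven from Python through nbconvert's ExecutePreprocessor; a minimal sketch, where notebook.ipynb is a placeholder filename:

import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

# Load the notebook file (nbformat version 4 is the current schema).
nb = nbformat.read("notebook.ipynb", as_version=4)

# Execute every cell in order, failing if any single cell runs
# longer than ten minutes.
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})

# Write the executed notebook, with all cell outputs, to a new file.
nbformat.write(nb, "notebook_executed.ipynb")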
Absolutely the most robust solution to this problem would be to use Docker containers. You can specify how much memory to allocate to Jupyter, and if the container runs out of memory it's simply not a big deal (just remember to save frequently, but that goes without saying).
This blog will get you most of the way there. There are also some decent instructions for setting up Jupyter Lab from one of the freely available, officially maintained Jupyter images here:
https://medium.com/fundbox-engineering/overview-d3759e83969c
You can then modify the docker run command as described in the tutorial (e.g. for 3GB):
docker run --memory 3g <other docker run args from tutorial here>
For syntax on the docker memory options, see this question:
What unit does the docker run "--memory" option expect?
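If you would rather drive Docker from Python than from the shell, the same cap can be set through the Docker SDK (pip install docker); a sketch, assuming the freely available jupyter/datascience-notebook image:

import docker

client = docker.from_env()

# Start the official image with a hard 3 GB memory cap; if the kernel
# exhausts it, Docker kills the container instead of freezing the host.
container = client.containers.run(
    "jupyter/datascience-notebook",
    ports={"8888/tcp": 8888},  # expose Jupyter on localhost:8888
    mem_limit="3g",
    detach=True,
)

# The login token shows up in the container logs shortly after startup.
print(container.logs(tail=20).decode())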