How to increase Jupyter notebook Memory limit?

I am using Jupyter Notebook with Python 3 on Windows 10. My computer has 8 GB of RAM and at least 4 GB of it is free.

But when I try to create a NumPy ndarray of size 6000*6000 with this command: np.zeros((6000, 6000), dtype='float64'), I get this error: Unable to allocate array with shape (6000, 6000) and data type float64

I don't think this should use more than 100 MB of RAM. I tried changing the number to see what happens. The biggest array I can make is (5000, 5000). Did I make a mistake in estimating how much RAM I need?
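The expected footprint is easy to check by hand; this sketch uses plain arithmetic (nothing Jupyter-specific), with float64 taking 8 bytes per element:

```python
# Memory needed for a 6000 x 6000 array of float64 (8 bytes per element)
rows, cols, itemsize = 6000, 6000, 8
expected_bytes = rows * cols * itemsize
print(expected_bytes)                    # 288000000 bytes
print(round(expected_bytes / 2**20, 1))  # about 274.7 MiB
```

So the array actually needs roughly 288 MB rather than 100 MB, although that should still fit comfortably in 4 GB of free RAM.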

Dawyi asked Sep 15 '19 20:09

People also ask

How do I increase memory allocated in Jupyter Notebook?

Generate a config file with jupyter notebook --generate-config, open jupyter_notebook_config.py, uncomment and set the max_buffer_size property, then save the file and restart the notebook. It should now use the set memory value.

Does jupyter notebook have a memory limit?

JupyterLab (or AI Notebook) in GCP is by default set with a maximum of 3.2 GB of memory. If the amount of data loaded into RAM is too large, the Jupyter kernel will "die".

How do I check my jupyter notebook memory limit?

The jupyter-resource-usage extension is part of the default installation, and tells you how much memory your user is using right now, and what the memory limit for your user is. It is shown in the top right corner of the notebook interface.


2 Answers

Jupyter Notebook has a default memory limit size. You can try to increase the limit with the following steps:

1) Generate a config file using the command:

jupyter notebook --generate-config

2) Open the jupyter_notebook_config.py file created inside the '.jupyter' folder and edit the following property:

c.NotebookApp.max_buffer_size = your desired value

Remember to uncomment the line (remove the '#' in front of the property) so it takes effect.

3) Save the file and restart Jupyter Notebook. It should now use the set memory value.

Alternatively, you can simply pass the setting when starting the Notebook:

jupyter notebook --NotebookApp.max_buffer_size=your_value
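For example, the edited line in jupyter_notebook_config.py might look like this (the 1 GB figure is only an illustration, not a recommendation; the `c` object is supplied by Jupyter's config loader):

```python
# ~/.jupyter/jupyter_notebook_config.py
# Value is in bytes; 1000000000 is roughly 1 GB (hypothetical example value)
c.NotebookApp.max_buffer_size = 1000000000
```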
Divya Kaushik answered Oct 02 '22 05:10

For Jupyter you need to consider 2 processes:

  1. The local HTTP server (which is based on Tornado)
  2. The kernel process (normally local, but it can be distributed, depending on your config).

max_buffer_size is a Tornado web server setting: the maximum amount of incoming data to buffer, which defaults to 100 MB (104857600 bytes). (https://www.tornadoweb.org/en/stable/httpserver.html)
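A quick sanity check of the figure quoted above (pure arithmetic, nothing Jupyter-specific):

```python
# Tornado's documented default max_buffer_size, in bytes
default_buffer = 104857600
print(default_buffer == 100 * 1024 * 1024)  # True: exactly 100 MiB
```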

Based on this PR, this value seems to have been increased to 500 MB in Notebook.

To my knowledge, the Tornado HTTP server does not let you cap its memory use; it simply runs as a Python 3 process.

For the kernel, you should look at the command defined in the kernel spec.

An option to try would be this one

gogasca answered Oct 02 '22 03:10