I have a relatively large Jupyter Notebook (about 40 GB of Pandas DataFrames in RAM). I'm running a Python 3.6 kernel installed with Conda.
I have about 115 cells that I'm executing. If I restart the kernel and run all the cells, the whole notebook finishes in about 3 minutes. But if I then re-run a single simple cell that does very little work (e.g. a function definition), it takes an extremely long time to execute (~15 minutes).
I cannot find any documentation online on Jupyter notebook installation best practices. My disk usage is low, available RAM is high, and CPU load is very low.
My swap space does seem to be maxed out, but I'm not sure what would be causing this.
Any recommendations on troubleshooting a poor-performing Jupyter notebook server? This seems to be related to re-running cells only.
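For reference, this is roughly how I'm checking memory and swap from inside the notebook (a minimal sketch using psutil, which is not part of the standard library and may need to be installed separately; the numbers match what free/top report):

    # Quick check of RAM and swap usage from inside the notebook.
    # Requires psutil (e.g. conda install psutil).
    import psutil

    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    print(f"RAM:  {vm.used / 1e9:.1f} / {vm.total / 1e9:.1f} GB used ({vm.percent}%)")
    print(f"Swap: {sw.used / 1e9:.1f} / {sw.total / 1e9:.1f} GB used ({sw.percent}%)")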
If you are trying to run deep learning models, use Google Colab or Paperspace instead; they will work better and offer limited GPU time for free. If you are running normal scripts, I think the problem is with the memory on your PC, so make sure you have enough disk space. Even if it's slow, try re-installing Jupyter again.
Try in another browser (e.g. if you normally use Firefox, try with Chrome). This helps pin down where the problem is. Try disabling any browser extensions and/or any Jupyter extensions you have installed. Some internet security software can interfere with Jupyter.
Memory and disk space required per user: 1 GB RAM + 1 GB of disk + 0.5 CPU core.
If the Variable Inspector nbextension is activated, it might slow down the notebook when you have large variables in memory (such as your Pandas dataframes).
See: https://github.com/ipython-contrib/jupyter_contrib_nbextensions/issues/1275
If that's the case, try disabling it in Edit -> nbextensions config.
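To get an idea of what the inspector would be repeatedly scanning, a rough sketch like this (run in a notebook cell; it just sizes whatever top-level names exist in your session) prints the largest objects in the namespace:

    # Rough sizes of the largest objects in the notebook's global namespace,
    # i.e. the variables an inspector-style extension has to summarise.
    import sys
    import pandas as pd

    sizes = {}
    for name, obj in list(globals().items()):
        if isinstance(obj, pd.DataFrame):
            # deep=True also counts object/string columns
            sizes[name] = obj.memory_usage(deep=True).sum()
        else:
            sizes[name] = sys.getsizeof(obj)

    for name, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True)[:10]:
        print(f"{name}: {size / 1e9:.2f} GB")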