 

Jupyter notebook kernel constantly needs to be restarted

My installation of Jupyter Notebook only allows the execution of a few cells before becoming unresponsive. After moving to this "unresponsive mode", the execution of any cell, even a newly written one with a basic arithmetic command, will not noticeably execute or show output. Restarting the kernel is the only solution I've been able to find and that makes development painfully slow.

I'm running Jupyter 1 and Python 3.9 on Windows 10. I've read the Jupyter documentation and can't find a reference to this issue. Similarly, there is no console output when Jupyter goes into "unresponsive mode". I've resolved all warnings shown in the console on startup.

I apologize for such a vague question; part of my problem is that I'm not sure what's gone wrong. I'm doing some basic data analysis with pandas:

%pylab
import pandas as pd
import glob
from scipy.signal import find_peaks

# Import data
dataFiles = glob.glob("Data/*.spe")
dataList = [pd.read_csv(f, names=[f]) for f in dataFiles]

# Join data into one DataFrame for ease
combinedData = pd.concat(dataList, axis=1, join="inner")

# Trim off arbitrary header and footers for each data run
lowerJunkRow = 12
upperJunkRow = 16395
combinedData = combinedData.truncate(before=lowerJunkRow, after=upperJunkRow)
combinedData.reset_index(drop=True, inplace=True)

# Cast dataFrame to integers
combinedData = combinedData.astype(int)

# Sum all counts by channel to aggregate data
combinedData["sum"] = combinedData.sum(axis=1)

Edit: I tried working in a different notebook with similar libraries and everything worked fine until I referenced a variable that I hadn't defined. The kernel then exhibited the same behavior as above. I tried saving my data in one combined csv file to avoid the large amount of memory the above code generates, but no dice. I also experience the same issue in Jupyter Lab which leads me to believe it's a kernel issue.

asked Dec 06 '25 03:12 by Carter
1 Answer

It seems to me that you are processing a large quantity of data. The "unresponsive" state may simply mean the kernel is still busy executing a cell that takes a long time to finish.
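One way to check whether a cell is genuinely hung or just slow is to time each step of the pipeline. This is only a sketch; the `timed` helper name is an illustration, not part of any library:

```python
import time

def timed(label, fn):
    """Run fn() and report how long it took, to show which step is slow."""
    start = time.perf_counter()
    result = fn()
    print(f"{label}: {time.perf_counter() - start:.2f}s")
    return result

# Example: wrap the expensive steps from the question, e.g.
# data_list = timed("read files", lambda: [pd.read_csv(f, names=[f]) for f in data_files])
# combined = timed("concat", lambda: pd.concat(data_list, axis=1, join="inner"))
```

If one label dominates the output, the kernel is working rather than dead, and that step is the one to optimize or cache.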

If you are attempting to concatenate multiple csv files, I suggest at least saving the concatenated dataframe as a csv. You can then check if this file exists (using the os module), and read in this csv instead of going through the rigmarole of concatenating everything again.
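A minimal sketch of that caching approach, assuming the same `Data/*.spe` layout as the question (the function and cache-file names here are just illustrative):

```python
import glob
import os

import pandas as pd

def load_combined_data(pattern="Data/*.spe", cache_file="Data/combined.csv"):
    """Build the concatenated DataFrame once and reuse the cached CSV afterwards."""
    # Reuse the cached concatenation if it already exists
    if os.path.exists(cache_file):
        return pd.read_csv(cache_file, index_col=0)

    # Otherwise build it once from the raw files, as in the question
    data_files = glob.glob(pattern)
    data_list = [pd.read_csv(f, names=[f]) for f in data_files]
    combined = pd.concat(data_list, axis=1, join="inner")

    combined.to_csv(cache_file)  # cache for the next kernel restart
    return combined
```

On the first run this pays the full concatenation cost; on every later run (or after a kernel restart) it only reads a single CSV, which should be much faster.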

answered Dec 07 '25 19:12 by Tytrox