I am using Google Colab on a dataset with 4 million rows and 29 columns. When I run the statement sns.heatmap(dataset.isnull()), it runs for some time, but after a while the session crashes and the instance restarts. This has been happening a lot, and so far I haven't actually seen any output. What could be the possible reason? Is the data/calculation too much? What can I do?
You can confirm whether memory is the culprit by running the code in a Colab cell and waiting for the crash: as soon as the platform runs out of RAM, it automatically shows a notice that reads "Your session crashed after using all available RAM."
Press Ctrl+Shift+I to open the inspector view, then go to the Console tab and run the keep-alive snippet shown below. It keeps clicking the page and prevents the session from disconnecting. It solved the issue for me.
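The widely shared version of that keep-alive script looks like the following. Note that the colab-toolbar-button#connect selector is an assumption tied to Colab's page structure at the time this was circulated, and may need updating if the UI changes:
function ClickConnect() {
  // Click Colab's "Connect" toolbar button so the page registers activity.
  console.log("Keeping session alive");
  document.querySelector("colab-toolbar-button#connect").click();
}
// Repeat every 60 seconds.
setInterval(ClickConnect, 60000);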
Colab is 100% free, and so naturally it has some resource constraints. Each instance of Colab comes with 12 GB of RAM (actually 12.7 GB, but 0.8 GB is already taken by the runtime). That's plenty for many workloads, especially considering that you don't need to pay for it.
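If you want to see how much of that RAM your own runtime has left before a big operation, you can query it from a cell. This is a minimal sketch using psutil, which ships preinstalled in Colab:
import psutil

ram = psutil.virtual_memory()
print(f"Total RAM:     {ram.total / 1024**3:.1f} GiB")
print(f"Available RAM: {ram.available / 1024**3:.1f} GiB")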
I'm not sure what is causing your specific crash, but a common cause is an out-of-memory error, and your case fits: sns.heatmap(dataset.isnull()) materializes a 4,000,000 × 29 boolean matrix and then asks matplotlib to draw a colored cell for every one of its roughly 116 million entries, which can easily exhaust the instance's RAM. You might try working with a subset of the dataset and see if the error recurs.
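Here is a minimal sketch of that approach, assuming dataset is the 4-million-row DataFrame from the question; the sample size of 100,000 is an arbitrary starting point, not a recommendation:
import matplotlib.pyplot as plt
import seaborn as sns

# Plot missingness for a random sample of rows instead of all 4 million.
sample = dataset.sample(n=100_000, random_state=0)
sns.heatmap(sample.isnull())
plt.show()

# Or skip rendering millions of cells entirely: per-column null fractions
# usually answer the same question.
print(dataset.isnull().mean().sort_values(ascending=False))
If the sampled heatmap renders fine, the crash was almost certainly memory, and you can scale the sample up until you find your instance's limit.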
Otherwise, Colab keeps logs in /var/log/colab-jupyter.log. You may be able to get more insight into what is going on by printing its contents. Either run:
!cat /var/log/colab-jupyter.log
Or, to get the messages alone (easier to read):
import json

with open("/var/log/colab-jupyter.log", "r") as fo:
    for line in fo:
        # Each log line is a JSON object; print only its 'msg' field.
        print(json.loads(line)['msg'])