Memory usage is close to the limit in Google Colab

I'm using Google Colab to train my model. After training, I want to change the model, but I can't because there is not enough RAM for it. I tried re-assigning the old model to None, but the RAM in use didn't decrease.

[Screenshot: Colab RAM usage indicator close to the limit]

I don't want to close the session and start from the beginning. Is there any way to free up RAM used in google colab?

Ha Bom asked Oct 31 '25 05:10

1 Answer

I had this problem. I was looping through different models I was building, and clearing the Keras session from memory after each run helped, as suggested in another Stack Overflow answer:

from tensorflow.keras import backend as K
K.clear_session()

For some other users, resetting the default graph also helped (this is a TensorFlow 1.x API; in TensorFlow 2.x it is available as tf.compat.v1.reset_default_graph()):

import tensorflow as tf
tf.reset_default_graph()

Your RAM might also be exhausted, without you noticing, because you are loading your data into a pandas DataFrame. In that case the following might help as well; adding these lines at the end of each loop iteration cleared the memory in my case:

import gc
import pandas as pd

del df               # drop the reference to the large DataFrame
gc.collect()         # run garbage collection to release the memory now
df = pd.DataFrame()  # start the next iteration with a fresh, empty DataFrame
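To see why re-assigning a variable to None (as the question tried) can fail to free memory, note that Python only releases an object once *every* reference to it is gone; in a notebook, extra references often linger in other variables or in the output cache. The following minimal sketch uses only the standard library, with a hypothetical BigModel class standing in for a trained model, and verifies the object's lifetime with a weak reference:

```python
import gc
import weakref

class BigModel:
    """Hypothetical stand-in for a large trained model."""
    def __init__(self):
        self.weights = [0.0] * 1_000_000  # simulate a large allocation

model = BigModel()
alias = model                  # a second reference, e.g. left over in another cell
tracker = weakref.ref(model)   # lets us observe when the object is truly freed

model = None                   # clearing one name is not enough...
assert tracker() is not None   # ...the object survives via `alias`

del alias                      # drop the last remaining reference
gc.collect()                   # also reclaim any reference cycles immediately
assert tracker() is None       # now the object (and its memory) is gone
```

In Colab specifically, deleting every name that points at the old model and then calling gc.collect() is what actually lets the RAM be reclaimed.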
NeStack answered Nov 03 '25 00:11