I'm using Google Colab to train my model. After training, I want to change the model, but I can't because there isn't enough RAM. I tried re-assigning the old model to None, but the RAM usage didn't decrease.

I don't want to close the session and start over from the beginning. Is there any way to free up the RAM used in Google Colab?
I had this problem. I was looping through different models I was building, and it helped me to clear the session from memory after each run, as suggested in another Stack Overflow answer:
from tensorflow.keras import backend as K
K.clear_session()
For some other users this also helped (note that in TensorFlow 2.x this function has moved to tf.compat.v1.reset_default_graph()):
import tensorflow as tf
tf.reset_default_graph()
Your RAM might also be getting exhausted, without you noticing, because you are loading your data from a pandas DataFrame. In that case the following might help you too; more precisely, adding these lines at the end of each loop iteration cleared the memory in my case:
import gc
import pandas as pd

del df               # drop the reference to the large DataFrame
gc.collect()         # force a garbage-collection pass to reclaim the memory
df = pd.DataFrame()  # start the next iteration with a fresh, empty frame
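The same del-then-gc.collect() pattern applies to the model object itself. A minimal, self-contained sketch of the idea, using only the standard library (a large list stands in for the model, and tracemalloc is used just to observe the effect):

```python
import gc
import tracemalloc

tracemalloc.start()

# Stand-in for a large model or DataFrame: roughly a million Python floats
big_object = [float(i) for i in range(1_000_000)]
before, _ = tracemalloc.get_traced_memory()

# Drop the only reference, then force a collection pass
del big_object
gc.collect()

after, _ = tracemalloc.get_traced_memory()
print(after < before)  # traced memory has dropped after the cleanup

tracemalloc.stop()
```

The key point is that memory is only reclaimable once no references to the object remain; setting one variable to None does not help if other references (e.g. a Keras session or a notebook output cell) are still holding on to the model, which is why K.clear_session() matters in the Keras case.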