I am trying to extract a 2.2 GB dataset (tar.gz) stored on my Google Drive so that I can run models on it in Colab.
I use the command !tar -xf source.tar.gz -C destination to extract it to my desired directory.
After about 30 minutes the extraction finishes and all the files look correct. But when I restart the session after a while, I see that more than half of the files are missing. So I extract them again, close the session, come back, and find that almost all of them are gone.
How could I fix this? The Google Drive web interface is also very laggy and out of sync with the changes happening in Colab.
I really need the GPU on Colab. How do I resolve this issue?
I even tried tf.keras.utils.get_file with the extract option enabled, but I lost most of my files again after I reopened the notebook.
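For reference, the get_file call looked roughly like this (the URL and paths here are placeholders, not my real ones):

    import tensorflow as tf

    # Download the archive and unpack it, caching everything in a Drive folder
    path = tf.keras.utils.get_file(
        fname="source.tar.gz",
        origin="https://example.com/source.tar.gz",   # placeholder URL for the archive
        extract=True,                                 # unpack after download
        cache_dir="/content/drive/MyDrive/data",      # placeholder Drive path
    )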
EDIT: I forgot to mention that the Drive folder is shared with some other people I am working on the project with. Is it possible that there is not enough space, so the files are kept in memory while the session is running and never fully moved to Drive?
You have the option to mount your Google Drive into your Colab virtual machine:
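A minimal sketch of the mount step (the mount point /content/drive is the Colab convention; the archive path below is a placeholder):

    from google.colab import drive

    # Mount Google Drive into the Colab VM; you will be asked to authorize access
    drive.mount('/content/drive')

    # After mounting, your Drive contents appear under /content/drive/MyDrive
    # (older runtimes expose it as "/content/drive/My Drive")

Once mounted, the archive can be read like any local file. Because the Drive mount syncs lazily, writing thousands of extracted files back to it is slow and can look incomplete after a restart, so one common workaround is to keep only the compressed archive on Drive and extract onto the VM's local disk each session, for example:

    !mkdir -p /content/dataset
    !tar -xzf "/content/drive/MyDrive/source.tar.gz" -C /content/dataset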
