How to increase Google Colab storage

I am working with a dataset of 70 GB.

Earlier, running the `df -BG` command showed:

Filesystem     1G-blocks  Used Available Use% Mounted on
overlay             359G    6G      335G   2% /
tmpfs                 7G    0G        7G   0% /dev
tmpfs                 7G    0G        7G   0% /sys/fs/cgroup
/dev/root             2G    1G        1G  44% /opt/bin
tmpfs                 7G    1G        7G   4% /usr/lib64-nvidia
/dev/sda1           365G    8G      358G   3% /etc/hosts
shm                   1G    0G        1G   0% /dev/shm
tmpfs                 7G    0G        7G   0% /sys/firmware

Now it has suddenly changed to:

Filesystem     1G-blocks  Used Available Use% Mounted on
overlay              40G    5G       33G  14% /
tmpfs                 7G    0G        7G   0% /dev
tmpfs                 7G    0G        7G   0% /sys/fs/cgroup
/dev/sda1            46G   40G        6G  88% /etc/hosts
shm                   1G    0G        1G   0% /dev/shm
tmpfs                 7G    0G        7G   0% /sys/firmware

Can someone suggest a way to create a new notebook with more than 300 GB available, or a way to get back to the previous state?

Abhik Sarkar asked May 09 '18 19:05


2 Answers

I had the same problem. I am not sure this is a solution since I haven't tested it thoroughly, but it seems like the [Python 2 / No GPU] and [Python 3 / No GPU] runtimes have only 40GB of storage, whereas the [Python 3 / GPU] runtime has 359GB of storage.

Try changing your notebook runtime type to [Python 3 / GPU] by going to "Runtime" > "Change runtime type". Hope it helps!
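To confirm how much storage a runtime actually has after switching, you can run `df` from a notebook cell yourself (in Colab, prefix each command with `!`). This is just a sanity check, not part of the fix:

```shell
# Total, used, and available space in 1 GB blocks for the root filesystem
df -BG /

# The same information in human-readable units
df -h /
```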

ninjin answered Sep 26 '22 11:09


If you pay for extra storage in Google Drive, you can mount your Drive under the /content/drive/ folder as follows:

from google.colab import drive

# On first run this prompts you for an authorization code.
drive.mount('/content/drive')

You can even use it for unzipping datasets. (In my case, Colab had enough space to download the 18 GB COCO dataset, but not enough space to unzip it.)

!unzip /content/train2017.zip -d /content/drive/My\ Drive/COCO/train_2017
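If you prefer to do the same extraction from Python rather than a shell command, the standard library's `zipfile` module works too. The paths below mirror the example above and are assumptions about your layout, so adjust them to your own archive and Drive folder:

```python
import zipfile

def unzip_to(src_zip, dest_dir):
    # Extract every member of the archive into dest_dir,
    # creating intermediate directories as needed.
    with zipfile.ZipFile(src_zip) as zf:
        zf.extractall(dest_dir)

# Hypothetical paths, matching the shell example above:
# unzip_to('/content/train2017.zip', '/content/drive/My Drive/COCO/train_2017')
```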
Martin answered Sep 25 '22 11:09