I connected my Google Drive to Google Colab with this code:
# Load the Drive helper and mount
from google.colab import drive
# This will prompt for authorization.
drive.mount('/content/drive')
Now, when I want to read a series of folders containing images in my Drive, it runs very slowly compared to my PC!
I've also noticed that if I run the code a second time, folders that were already read in the previous run load faster.
Do you have any suggestions for this problem? Thanks.
The best method for me has been to compress the files and then decompress them onto the VM disk.
Reading the files from the VM disk is much faster than reading each file individually from Drive.
Let's say you have the desired images or data on your local machine in a folder called Data. Compress Data to get Data.zip and upload it to Drive (if you only have these files on Drive, you can compress them there as well).
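If you'd rather do the compression step in Python instead of with a zip utility, something like this should work; the paths here are placeholders for your own layout:
import shutil
# Compress the local ./Data folder into ./Data.zip so it can be uploaded to Drive.
shutil.make_archive("Data", "zip", root_dir=".", base_dir="Data")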
Now, mount your drive and run the following command:
!unzip "/content/drive/My Drive/path/to/Data.Zip" -d "/content"
Now amend all your data paths to go through /content/Data, and reading your images will be much, much faster.
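As a rough sketch of what the updated reading code might look like (this assumes JPEG images and the PIL library, which ships with Colab; adjust the pattern for your own files):
import glob
from PIL import Image
# Collect image paths from the local copy on the VM disk instead of Drive.
image_paths = glob.glob('/content/Data/**/*.jpg', recursive=True)
# Loading from /content avoids a separate Drive request per file.
images = [Image.open(p) for p in image_paths]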
**Adapted from my answer to a previous question; I had initially looked at both questions to no avail.