I want to run a notebook that uses many header files defined in a directory, so I need to upload the entire directory to Google Colab. However, I can only find options to upload individual files, not complete folders. How can I upload an entire directory to Google Colab?
I suggest not uploading the files directly to Colab: when the runtime restarts you lose them and have to re-upload, which becomes a real problem with very big datasets.
Instead, use the google.colab package to manage files and folders in Colab. Upload everything you need to your Google Drive, then mount it:
from google.colab import drive
drive.mount('/content/gdrive')
This way, you only need to log in to your Google account through the Google authentication API, and you can use the files and folders as if they had been uploaded to Colab directly.
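Once the drive is mounted, the directory can be referenced in place, or copied into the runtime so relative paths resolve. A minimal sketch, assuming the headers live in a folder named my_project at the Drive root (the folder name is a placeholder):
import shutil
# 'My Drive' is the default mount root; 'my_project' is a placeholder folder name.
src = '/content/gdrive/My Drive/my_project'
# Copy the whole directory into the Colab runtime.
shutil.copytree(src, '/content/my_project')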
You can zip the folder, upload the archive, then unzip it:
!unzip file.zip
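If you prefer to drive the upload step from inside the notebook rather than the file sidebar, the files helper from the same google.colab package works; a minimal sketch (the archive is whatever you zipped locally):
from google.colab import files
# Opens a browser file picker; returns a dict mapping filenames to their bytes.
uploaded = files.upload()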
The easiest way, if the folder or file is on your local drive:
from zipfile import ZipFile
file_name = 'file.zip'  # path to the uploaded archive
with ZipFile(file_name, 'r') as zf:
    zf.extractall()  # extracts into the current working directory
print('Done')
Downside: the files will be deleted once the runtime ends.
You can reuse part of these steps if your file is on Google Drive: upload the zipped file to Colab from Google Drive instead, then extract it the same way, as sketched below.
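A minimal sketch of that variant, assuming the archive was saved to the Drive root as archive.zip (a hypothetical name):
from google.colab import drive
from zipfile import ZipFile
drive.mount('/content/gdrive')
# 'archive.zip' and the 'My Drive' prefix are assumptions; adjust to your layout.
with ZipFile('/content/gdrive/My Drive/archive.zip', 'r') as zf:
    zf.extractall('/content')
Because the archive stays on Drive, only the extracted copy is lost when the runtime resets; re-running the cell restores it without a manual upload.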
You can create a Git repository, push the files and folders to it, and then clone the repository in Colab with the command
!git clone https://github.com/{username}/{projectname}.git
I find this method faster. However, if any file is larger than 100 MB, you will have to zip it or use Git Large File Storage (LFS) to push it to GitHub. For more information, see the link below.
https://help.github.com/en/github/managing-large-files/configuring-git-large-file-storage
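After cloning, you can change into the repository so the notebook finds the header files via relative paths; a minimal sketch, with projectname standing in for your repository:
import os
# 'projectname' is the placeholder from the clone command above.
os.chdir('/content/projectname')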