Currently I am working on a 10 GB data set. I have uploaded it to Google Cloud Storage, but I don't know how to import it into Google Colab.
One simple option is Colab's built-in upload widget: run `files.upload()` from `google.colab`, click "Choose Files", select the file, and wait for the upload to reach 100% (the filename appears once Colab has received it). Then read the uploaded file into a dataframe, making sure the filename in your code matches the name of the uploaded file. Note that this uploads from your local machine, so it is slow for a 10 GB file; since the data is already in Google Cloud Storage, the GCS-based approach below is a better fit.
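A minimal sketch of the upload approach (the filename `data.csv` is a placeholder, and `files.upload()` only works inside a Colab runtime, so the import is guarded here):

```python
import io

import pandas as pd

try:
    from google.colab import files  # only available inside a Colab runtime
    IN_COLAB = True
except ImportError:
    IN_COLAB = False

if IN_COLAB:
    # Opens the "Choose Files" browser widget; returns a {filename: bytes} dict.
    uploaded = files.upload()
    # "data.csv" is a placeholder -- use the name of the file you uploaded.
    df = pd.read_csv(io.BytesIO(uploaded["data.csv"]))
```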
Google Colab is a free cloud service that hosts Jupyter notebooks on remote servers. Students can use Google's GPU and TPU resources to run their Python code. For a quick introduction, Google's Colab intro notebook is a good starting point.
from google.colab import auth
auth.authenticate_user()
Once you run this, Colab generates a link; click it and complete the sign-in to authenticate with your Google account.
!echo "deb http://packages.cloud.google.com/apt gcsfuse-bionic main" > /etc/apt/sources.list.d/gcsfuse.list
!curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
!apt -qq update
!apt -qq install gcsfuse
Use this to install gcsfuse on Colab. Cloud Storage FUSE is an open-source FUSE adapter that lets you mount Cloud Storage buckets as file systems on Colab, Linux, or macOS systems.
!mkdir folderOnColab
!gcsfuse folderOnBucket folderOnColab
Use this to mount the bucket. (folderOnBucket is the bucket name, i.e. the GCS URL without the gs:// prefix. gcsfuse takes a bare bucket name, not a path; to mount only a subdirectory of the bucket, pass it with the --only-dir flag.)
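Once mounted, the bucket's objects behave like ordinary local files. A minimal sketch of reading the data with pandas, assuming a hypothetical CSV object (`my_data.csv` is a placeholder); for a 10 GB file, a chunked read avoids needing the whole file in Colab's RAM at once:

```python
import pandas as pd

def count_rows(path, chunksize=1_000_000):
    """Stream a CSV in chunks so a 10 GB file never has to fit in RAM at once."""
    total = 0
    for chunk in pd.read_csv(path, chunksize=chunksize):
        total += len(chunk)  # replace with your real per-chunk processing
    return total

# After mounting, the bucket's objects look like ordinary local files:
# count_rows("folderOnColab/my_data.csv")   # "my_data.csv" is a placeholder
```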
See the Cloud Storage FUSE docs for further reading: https://cloud.google.com/storage/docs/gcs-fuse