 

How to activate GPU computing in Google colab?

Tags:

python

torch

I'm a beginner in Torch and Python.

I was experimenting with some machine learning code that I found online using Google Colab, and I got the following error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-4-d4b0db6cedae> in <module>()
    295                         input_dropout=input_dropout, hidden_dropout1=hidden_dropout1,
    296                         hidden_dropout2= hidden_dropout2, label_smoothing= label_smoothing)
--> 297 experiment.train_and_eval()
    298 
    299 

2 frames
/usr/local/lib/python3.6/dist-packages/torch/cuda/__init__.py in _lazy_init()
    195                 "Cannot re-initialize CUDA in forked subprocess. " + msg)
    196         _check_driver()
--> 197         torch._C._cuda_init()
    198         _cudart = _load_cudart()
    199         _cudart.cudaGetErrorName.restype = ctypes.c_char_p

RuntimeError: cuda runtime error (100) : no CUDA-capable device is detected at /pytorch/aten/src/THC/THCGeneral.cpp:50

I understand that CUDA is for GPU processing, so how can I fix the problem? I was experimenting with the code in this link:

user42493 asked Feb 24 '26 21:02


1 Answer

Have you tried the following?

Go to Menu > Runtime > Change runtime type.

Change hardware acceleration to GPU.

How to install CUDA in Google Colab GPU's
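After switching the runtime type, a quick sanity check can confirm that PyTorch actually sees the GPU. This is a minimal sketch (the tensor `x` and the device-selection pattern are illustrative, not from the original post); it falls back to the CPU so the same code also runs on a CPU-only runtime:

```python
import torch

# Pick the GPU if Colab's runtime exposes one, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# Create tensors (and move models) explicitly onto the selected device,
# so the code works whether or not hardware acceleration is enabled.
x = torch.ones(2, 2, device=device)
```

If `torch.cuda.is_available()` still returns `False` after changing the runtime type, reconnecting the runtime (Runtime > Disconnect and delete runtime, then reconnect) usually picks up the new hardware setting.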

oittaa answered Feb 27 '26 10:02


