
torch.cuda.is_available() returns false in colab

I am trying to use a GPU in Google Colab. Below are the versions of PyTorch and CUDA installed in my Colab environment.

Torch 1.3.1 CUDA 10.1.243

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2018 NVIDIA Corporation
Built on Sat_Aug_25_21:08:01_CDT_2018
Cuda compilation tools, release 10.0, V10.0.130

I am pretty new to using a GPU for transfer learning on PyTorch models. torch.cuda.is_available() returns false, so I am unable to use a GPU, while torch.backends.cudnn.enabled returns true. What might be going wrong here?
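For reference, a minimal snippet reproducing these checks might look like the following (the exact cell contents are an assumption, not taken from the original notebook):

import torch

# installed PyTorch build and the CUDA version it was compiled against
print(torch.__version__)               # e.g. 1.3.1
print(torch.version.cuda)              # e.g. 10.1.243

# the checks described above
print(torch.cuda.is_available())       # currently prints False
print(torch.backends.cudnn.enabled)    # currently prints True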

asked Dec 23 '22 by MJ2410

2 Answers

Make sure your hardware accelerator is set to GPU:

Runtime > Change runtime type > Hardware accelerator > GPU

Changing the runtime type restarts the runtime, so re-run your cells afterwards.
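Once the runtime has restarted with a GPU attached, a quick check like this (a minimal sketch, not part of the original answer) should confirm it:

import torch

print(torch.cuda.is_available())        # should now print True
print(torch.cuda.get_device_name(0))    # name of the attached GPU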

answered Jan 01 '23 by Kushal Arya


In case anyone else comes here and makes the same mistake I was making:

If you are trying to check whether a GPU is available and you write:

import torch

# bug: missing parentheses, so the function object (always truthy) is tested
if torch.cuda.is_available:
  print('GPU available')
else:
  print('Please set GPU via Edit -> Notebook Settings.')

it will always report that a GPU is available, because a function object is always truthy. Note that you need to call torch.cuda.is_available(), not reference torch.cuda.is_available.
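The corrected check (the same snippet with the parentheses added) looks like this:

import torch

# calling the function returns an actual bool
if torch.cuda.is_available():
  print('GPU available')
else:
  print('Please set GPU via Edit -> Notebook Settings.')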

answered Jan 01 '23 by Eric Wiener