I want to do some timing comparisons between CPU and GPU, as well as some profiling, and would like to know if there's a way to tell PyTorch not to use the GPU and instead use the CPU only. I realize I could install a separate CPU-only PyTorch, but I'm hoping there's an easier way.
I just wanted to add that it is also possible to do so within the PyTorch Code:
Here is a small example taken from the PyTorch Migration Guide for 0.4.0:
# at the beginning of the script
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

...

# then, whenever you get a new Tensor or Module
# (this won't copy if they are already on the desired device)
input = data.to(device)
model = MyModule(...).to(device)
I think the example is pretty self-explaining. But if there are any questions just ask!
One big advantage of this syntax is that the same code runs on the GPU when one is available and falls back to the CPU otherwise, without changing a single line.
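To connect this to the timing question: the device-agnostic pattern above lets you run the identical workload on CPU and GPU and compare. A minimal sketch (the `time_matmul` helper and the matrix-multiply workload are my own illustration, not from the migration guide); note that CUDA launches are asynchronous, so `torch.cuda.synchronize()` is needed before reading the clock:

```python
import time
import torch

def time_matmul(device, n=512, repeats=10):
    """Time repeated matrix multiplications on the given device."""
    a = torch.rand(n, n, device=device)
    b = torch.rand(n, n, device=device)
    torch.matmul(a, b)  # warm-up (first CUDA call pays init overhead)
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work before timing
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device.type == "cuda":
        torch.cuda.synchronize()  # make sure the GPU actually finished
    return time.perf_counter() - start

print(f"CPU: {time_matmul(torch.device('cpu')):.4f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul(torch.device('cuda:0')):.4f} s")
```

The same function body serves both devices; only the `device` argument changes.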
Instead of using the if-statement with torch.cuda.is_available(), you can also just set the device to CPU like this:
device = torch.device("cpu")
Further, you can create tensors on the desired device using the device flag:
mytensor = torch.rand(5, 5, device=device)
This will create a tensor directly on the device you specified previously.
I want to point out that this syntax lets you switch not only between CPU and GPU, but also between different GPUs.
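For example, on a multi-GPU machine you can address individual devices by index ("cuda:0", "cuda:1", ...). A small sketch (the fallback logic and indices are illustrative; adjust to your machine):

```python
import torch

# Pick a specific GPU by index if several are present; otherwise fall back.
if torch.cuda.device_count() > 1:
    device = torch.device("cuda:1")  # the second GPU
elif torch.cuda.is_available():
    device = torch.device("cuda:0")  # the only GPU
else:
    device = torch.device("cpu")     # no GPU available

x = torch.rand(5, 5, device=device)
print(x.device)  # e.g. cuda:1 or cpu, depending on the machine
```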
I hope this is helpful!
You can also set the CUDA_VISIBLE_DEVICES variable to an empty string in your shell before running your torch code:
export CUDA_VISIBLE_DEVICES=""
This tells torch that there are no GPUs.
export CUDA_VISIBLE_DEVICES="0"
tells it to use only one GPU (the one with id 0), and so on.
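The same trick works from inside Python, as long as the variable is set before torch initializes CUDA, so put it at the very top of the script, before `import torch` (a minimal sketch):

```python
import os

# Hide all GPUs from PyTorch. This must run BEFORE torch's CUDA
# initialization, hence before the torch import below.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

import torch

print(torch.cuda.is_available())  # False: no devices are visible
```

Setting it in the shell (as above) is more robust, since it cannot be defeated by an earlier `import torch` elsewhere in your program.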