
How to free gpu memory by deleting tensors?

Tags:

python

pytorch

Suppose I create a tensor and put it on the GPU and don't need it later and want to free the GPU memory allocated to it; How do I do it?

import torch
a=torch.randn(3,4).cuda() # nvidia-smi shows that some mem has been allocated.
# do something
# a does not exist and nvidia-smi shows that mem has been freed.

I have tried:

  1. del a
  2. del a; torch.cuda.empty_cache()

But none of them work.

Linghao.Chen asked Apr 22 '19


1 Answer

Running del tensor releases the tensor's memory back to PyTorch's caching allocator, but does not return it to the device, which is why nvidia-smi still shows the memory as used. A newly created tensor will reuse that cached memory. If you need the memory returned to the device (for example, for other processes), call torch.cuda.empty_cache() after the del; note that nvidia-smi will still show some usage for the CUDA context itself, which cannot be freed without ending the process.
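You can see the caching behavior directly with torch.cuda.memory_allocated (bytes used by live tensors) and torch.cuda.memory_reserved (bytes held by the caching allocator, which is roughly what nvidia-smi reports). A minimal sketch, guarded so it only runs when a GPU is present:

```python
import torch

def gpu_mem_stats():
    # memory_allocated: bytes actively used by live tensors
    # memory_reserved: bytes held by PyTorch's caching allocator
    return torch.cuda.memory_allocated(), torch.cuda.memory_reserved()

if torch.cuda.is_available():
    a = torch.randn(1024, 1024, device="cuda")  # ~4 MB of float32
    alloc_with_a, reserved_with_a = gpu_mem_stats()

    del a  # memory goes back to the caching allocator, not the device
    alloc_after_del, reserved_after_del = gpu_mem_stats()
    assert alloc_after_del < alloc_with_a        # allocated drops
    assert reserved_after_del == reserved_with_a # reserved (nvidia-smi) unchanged

    torch.cuda.empty_cache()  # returns unused cached blocks to the device
    _, reserved_after_empty = gpu_mem_stats()
    assert reserved_after_empty <= reserved_after_del
```

After empty_cache, nvidia-smi should report lower usage, minus the fixed CUDA context overhead.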

Sources

https://discuss.pytorch.org/t/how-to-delete-pytorch-objects-correctly-from-memory/947

https://discuss.pytorch.org/t/about-torch-cuda-empty-cache/34232

Haran Rajkumar answered Oct 24 '22