What does the underscore suffix in PyTorch functions mean?

Tags:

python

pytorch

In PyTorch, many tensor methods exist in two versions: one with an underscore suffix, and one without. If I try them out, they seem to do the same thing:

In [1]: import torch

In [2]: a = torch.tensor([2, 4, 6])

In [3]: a.add(10)
Out[3]: tensor([12, 14, 16])

In [4]: a.add_(10)
Out[4]: tensor([12, 14, 16])

What is the difference between

  • torch.add and torch.add_
  • torch.sub and torch.sub_
  • ...and so on?
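(The two calls only look identical because the snippet above discards each return value. Printing `a` after each call makes the difference visible; a short sketch using the same tensor:)

```python
import torch

a = torch.tensor([2, 4, 6])

b = a.add(10)    # out-of-place: returns a new tensor
print(a)         # tensor([2, 4, 6])  -- a is unchanged

c = a.add_(10)   # in-place: modifies a and returns it
print(a)         # tensor([12, 14, 16])  -- a was mutated
print(c is a)    # True
```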
asked Oct 21 '18 by soerface
1 Answer

According to the documentation, methods which end in an underscore change the tensor in-place. That means no new memory is allocated for the result, which in general improves performance, but in-place operations can also cause problems in PyTorch, for example when autograd needs the original values to compute gradients.

In [2]: a = torch.tensor([2, 4, 6])

tensor.add():

In [3]: b = a.add(10)

In [4]: a is b
Out[4]: False # b is a new tensor, new memory was allocated

tensor.add_():

In [3]: b = a.add_(10)

In [4]: a is b
Out[4]: True # Same object, no new memory was allocated
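The same naming convention applies across the tensor API, not just `add`/`add_`; a short sketch with a few other in-place variants:

```python
import torch

a = torch.ones(3)
a.mul_(5)    # in-place multiply: a becomes [5., 5., 5.]
a.sub_(2)    # in-place subtract: a becomes [3., 3., 3.]
a.zero_()    # in-place fill with zeros
print(a)     # tensor([0., 0., 0.])
```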

Note that the operators + and += are also two different implementations: + creates a new tensor using .add(), while += modifies the tensor using .add_().

In [2]: a = torch.tensor([2, 4, 6])

In [3]: id(a)
Out[3]: 140250660654104

In [4]: a += 10

In [5]: id(a)
Out[5]: 140250660654104 # Still the same object, no memory allocation was required

In [6]: a = a + 10

In [7]: id(a)
Out[7]: 140250649668272 # New object was created
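One concrete way in-place operations "lead to problems": PyTorch refuses an in-place operation on a leaf tensor that requires grad, since overwriting its values would break gradient computation. A minimal sketch:

```python
import torch

a = torch.tensor([2.0, 4.0, 6.0], requires_grad=True)

try:
    a.add_(10)   # in-place op on a leaf tensor that requires grad
except RuntimeError as e:
    # PyTorch raises a RuntimeError rather than silently corrupting autograd state
    print("in-place op rejected:", e)
```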
answered Sep 28 '22 by soerface