 

pytorch grad is None after .backward()

I just installed torch-1.0.0 on Python 3.7.2 (macOS) and am trying the tutorial, but the following code:

import torch
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()
print(out.grad)

prints None, which is not what I expected.

What's the problem?

asked Jan 11 '19 by fferri

People also ask

What does backward do in PyTorch?

The backward() method computes gradients during the backward pass in a neural network. The gradients are computed when this method is executed, and they are stored in the respective tensors' .grad attributes.
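
A minimal sketch of that flow (the tensor names here are just illustrative):

import torch

w = torch.tensor([2.0, 3.0], requires_grad=True)
loss = (w ** 2).sum()  # scalar output of the forward pass
loss.backward()        # backward pass: computes d(loss)/dw
print(w.grad)          # tensor([4., 6.]), stored on w itself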

What is Requires_grad in PyTorch?

To create a tensor with gradients, pass the extra parameter requires_grad=True when creating the tensor. requires_grad is a flag that controls whether a tensor requires a gradient or not. Only floating-point and complex dtype tensors can require gradients.
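
For example (a small sketch; the RuntimeError message below is paraphrased):

import torch

a = torch.ones(3, requires_grad=True)  # float dtype: gradients allowed
print(a.requires_grad)                 # True

b = torch.ones(3, dtype=torch.long)    # integer dtype
# b.requires_grad = True  # would raise RuntimeError: only floating point
#                         # and complex dtype tensors can require gradients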

What is forward and backward in PyTorch?

The forward function computes output Tensors from input Tensors. The backward function receives the gradient of the output Tensors with respect to some scalar value, and computes the gradient of the input Tensors with respect to that same scalar value.
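
This pairing is easiest to see in a custom torch.autograd.Function. Here is a minimal sketch implementing y = x²:

import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)  # keep the input for the backward pass
        return x * x              # output Tensor from input Tensor

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x  # chain rule: dy/dx = 2x

t = torch.tensor([3.0], requires_grad=True)
Square.apply(t).backward()
print(t.grad)  # tensor([6.])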

What is PyTorch variable?

A PyTorch Variable is a wrapper around a PyTorch Tensor, and represents a node in a computational graph. If x is a Variable then x.data is a Tensor giving its value, and x.grad is another Variable holding the gradient of x with respect to some scalar value.
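
Note that since PyTorch 0.4, Variable has been merged into Tensor: wrapping a tensor in Variable still works for backward compatibility, but it simply returns a Tensor. A short sketch:

import torch
from torch.autograd import Variable  # kept only for backward compatibility

v = Variable(torch.ones(2), requires_grad=True)
(v * 3).sum().backward()
print(v.data)  # tensor([1., 1.]), the wrapped values
print(v.grad)  # tensor([3., 3.]), the gradient of sum(3v) w.r.t. v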


2 Answers

This is the expected result.

.backward() accumulates gradients only in the leaf nodes of the computational graph. out is not a leaf node, hence its grad is None.
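
The question's own example shows this: x is a leaf (created by the user with requires_grad=True), while out is an intermediate result. A minimal sketch:

import torch

x = torch.ones(2, 2, requires_grad=True)  # leaf node
out = (3 * (x + 2) ** 2).mean()           # non-leaf node
out.backward()
print(x.grad)    # tensor([[4.5000, 4.5000], [4.5000, 4.5000]])
print(out.grad)  # None (call out.retain_grad() before backward() to keep it)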

autograd.backward behaves the same way.

autograd.grad can be used to find the gradient of any tensor w.r.t. any tensor. So if you do autograd.grad(out, out) you get (tensor(1.),) as output, which is as expected.
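
For example (retain_graph=True keeps the graph alive for the second call):

import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()

print(torch.autograd.grad(out, out, retain_graph=True))  # (tensor(1.),)
print(torch.autograd.grad(out, x))                       # gradient w.r.t. the leaf x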

Ref:

  • Tensor.backward (https://pytorch.org/docs/stable/autograd.html#torch.Tensor.backward)
  • autograd.backward (https://pytorch.org/docs/stable/autograd.html#torch.autograd.backward)
  • autograd.grad (https://pytorch.org/docs/stable/autograd.html#torch.autograd.grad)
answered Sep 19 '22 by Umang Gupta


If you want the gradients of non-leaf tensors, you can use register_hook on them in order to save the gradients somewhere (as shown in the answer to How to return intermediate gradients (for non-leaf nodes) in pytorch?, and sketched below).
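
A minimal sketch of that approach (grads is an illustrative dict used to stash the hooked gradient):

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2                                     # non-leaf tensor
grads = {}
y.register_hook(lambda g: grads.update(y=g))  # hook receives y's gradient
(3 * y * y).mean().backward()
print(grads['y'])  # tensor([[4.5000, 4.5000], [4.5000, 4.5000]])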

answered Sep 21 '22 by patapouf_ai