I just installed torch-1.0.0 on Python 3.7.2 (macOS) and am trying the tutorial, but the following code:
import torch
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()
print(out.grad)
prints None, which is not what I expected. What's the problem?
What does backward() do in PyTorch? The backward() method runs the backward pass of autograd: it computes the gradient of a (typically scalar) output with respect to the tensors that produced it, and stores those gradients in the .grad attribute of the corresponding leaf tensors.
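A minimal sketch of that (the tensor values here are just for illustration):

import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # scalar output
y.backward()         # computes dy/dx for every leaf tensor in the graph
print(x.grad)        # tensor([4., 6.]), since dy/dx = 2x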
To create a tensor with gradients, pass the extra argument requires_grad=True when creating it. requires_grad is a flag that controls whether a tensor requires a gradient. Only floating-point and complex dtype tensors can require gradients.
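For example (the exact wording of the error message may vary by version):

import torch

a = torch.ones(2, 2, requires_grad=True)   # OK: float32 by default
print(a.requires_grad)                     # True

try:
    b = torch.ones(2, 2, dtype=torch.int64, requires_grad=True)
except RuntimeError as e:
    print(e)  # only floating point (and complex) tensors can require gradients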
The forward function computes output Tensors from input Tensors. The backward function receives the gradient of the output Tensors with respect to some scalar value, and computes the gradient of the input Tensors with respect to that same scalar value.
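As a concrete sketch, this is how those two roles look in a custom torch.autograd.Function (the Square name is mine, for illustration):

import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # forward: input tensors -> output tensors
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        # backward: gradient of the output w.r.t. some scalar ->
        # gradient of the input w.r.t. that same scalar (chain rule)
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x

x = torch.tensor([3.0], requires_grad=True)
Square.apply(x).sum().backward()
print(x.grad)  # tensor([6.]), i.e. 2x at x = 3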
A PyTorch Variable is a wrapper around a PyTorch Tensor and represents a node in a computational graph. If x is a Variable, then x.data is a Tensor giving its value, and x.grad is another Variable holding the gradient of x with respect to some scalar value. (Note that since PyTorch 0.4, Variable has been merged into Tensor, so in torch-1.0.0 a plain tensor with requires_grad=True plays this role.)
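On torch-1.0.0 the same two attributes exist directly on tensors:

import torch

x = torch.ones(2, requires_grad=True)
(x * 3).sum().backward()
print(x.data)  # tensor([1., 1.])  -- the raw values
print(x.grad)  # tensor([3., 3.])  -- gradient w.r.t. the scalar output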
This is the expected result. .backward() accumulates gradients only in the leaf nodes of the computational graph. out is not a leaf node, hence its grad is None.
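You can see this by checking is_leaf with the question's exact code:

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()

print(x.is_leaf, out.is_leaf)  # True False
print(x.grad)                  # tensor([[4.5000, 4.5000], [4.5000, 4.5000]])
print(out.grad)                # None -- non-leaf: gradient not retained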
autograd.backward also does the same thing.
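For instance, this sketch is equivalent to calling out.backward() in the question:

import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()
torch.autograd.backward(out)   # same effect as out.backward()
print(x.grad)                  # tensor([[4.5000, 4.5000], [4.5000, 4.5000]])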
autograd.grad can be used to find the gradient of any tensor w.r.t. any tensor. So if you do autograd.grad(out, out) you get (tensor(1.),) as output, which is as expected.
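Continuing with the question's setup (retain_graph=True is only needed here because the graph is reused for a second call):

import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()

print(torch.autograd.grad(out, out, retain_graph=True))
# (tensor(1.),)
print(torch.autograd.grad(out, x))
# (tensor([[4.5000, 4.5000], [4.5000, 4.5000]]),)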
If you want the non-leaf gradients, you can use register_hook on your non-leaf tensors to save them somewhere (as shown in the following answer: How to return intermediate gradients (for non-leaf nodes) in pytorch?).
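A small sketch of that pattern (the grads dict and save_grad helper are my own names, not from the linked answer):

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2
out = (y * y * 3).mean()

grads = {}
def save_grad(name):
    def hook(grad):
        grads[name] = grad   # stash the non-leaf gradient during backward
    return hook

out.register_hook(save_grad('out'))
y.register_hook(save_grad('y'))
out.backward()

print(grads['out'])  # tensor(1.) -- what out.grad would have been
print(grads['y'])    # tensor([[4.5000, 4.5000], [4.5000, 4.5000]])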