I know of two ways to exclude elements of a computation from the gradient calculation performed by backward().
Method 1: using with torch.no_grad()
    with torch.no_grad():
        y = reward + gamma * torch.max(net.forward(x))

    loss = criterion(net.forward(torch.from_numpy(o)), y)
    loss.backward()
Method 2: using .detach()
    y = reward + gamma * torch.max(net.forward(x))
    loss = criterion(net.forward(torch.from_numpy(o)), y.detach())
    loss.backward()
Is there a difference between these two? Are there benefits/downsides to either?
detach() separates a tensor from the current computational graph: it returns a new tensor that does not require a gradient. Use it when a tensor's values should not be traced for the gradient computation.
No copy is created; the detached tensor shares its storage with the original, but no gradients flow back through it, so the backward pass stops at the point where detach() was called. detach_() is the in-place version of detach().
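A minimal sketch (toy tensors, not taken from the question) showing that detach() returns a new tensor outside the graph, so backpropagation stops there:

    import torch

    x = torch.ones(3, requires_grad=True)
    y = (x * 2).detach()      # new tensor, cut out of the graph
    print(y.requires_grad)    # False

    # Gradients flow through x * 2, but not through its detached copy:
    loss = (x * 2).sum() + (y * 3).sum()
    loss.backward()
    print(x.grad)             # tensor([2., 2., 2.]) -- only the x * 2 term contributed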
tensor.detach() creates a tensor that shares storage with tensor but does not require grad. It detaches the output from the computational graph, so no gradient will be backpropagated along this variable.
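Because no copy is made, an in-place change through the detached tensor is visible in the original. A small, hypothetical illustration:

    import torch

    a = torch.ones(3, requires_grad=True)
    b = a.detach()            # no copy: b shares a's storage

    b[0] = 5.0                # in-place change through the detached tensor
    print(a)                  # tensor([5., 1., 1.], requires_grad=True) -- same memory
    print(b.requires_grad)    # False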
The wrapper with torch.no_grad() temporarily sets the requires_grad flag of every result computed inside it to false: torch.no_grad says that no operation should build the graph.
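A short sketch of the same point: any result produced inside the context carries no graph, while the same operation outside the context does.

    import torch

    x = torch.ones(3, requires_grad=True)

    with torch.no_grad():
        y = x * 2             # no graph is recorded for this operation
    print(y.requires_grad)    # False
    print(y.grad_fn)          # None

    z = x * 2                 # outside the context, the graph is built
    print(z.requires_grad)    # True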
The difference is that detach() applies only to the single tensor on which it is called, while torch.no_grad affects all operations taking place within the with statement. Also, torch.no_grad will use less memory, because it knows from the beginning that no gradients are needed, so it does not need to keep intermediate results.
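To make the scope and memory point concrete, here is a hedged sketch in the spirit of the question's snippet; the network, criterion, and inputs are made-up stand-ins, not the questioner's actual code:

    import torch

    net = torch.nn.Linear(4, 2)          # hypothetical stand-in for the question's net
    criterion = torch.nn.MSELoss()
    x, o = torch.randn(4), torch.randn(4)
    reward, gamma = 1.0, 0.99

    # Method 1: no graph (and no intermediate buffers) is created at all.
    with torch.no_grad():
        y1 = reward + gamma * torch.max(net(x))

    # Method 2: the graph for net(x) is built first and only discarded when
    # the final tensor is detached, so it briefly costs more memory.
    y2 = (reward + gamma * torch.max(net(x))).detach()

    print(y1.requires_grad, y2.requires_grad)   # False False

    # In both cases the backward pass only reaches the second forward call.
    loss = criterion(net(o), y2.expand(2))
    loss.backward()
    print(net.weight.grad.shape)                # torch.Size([2, 4])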