
tensorflow stop_gradient equivalent in pytorch

What is the PyTorch equivalent of tf.stop_gradient() (which provides a way to not compute gradients with respect to some variables during back-propagation)?

asked Jul 26 '18 by aerin

People also ask

What is Requires_grad in Pytorch?

Tensor.requires_grad is True if gradients need to be computed for this Tensor, False otherwise. The fact that gradients need to be computed for a Tensor does not mean that its grad attribute will be populated; see is_leaf for more details.
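A minimal sketch of how requires_grad and grad interact (tensor names are illustrative):

```python
import torch

# A leaf tensor created with requires_grad=True participates in autograd.
w = torch.ones(3, requires_grad=True)
x = torch.ones(3)  # defaults to requires_grad=False

y = (w * x).sum()
y.backward()

print(w.requires_grad)  # True
print(x.requires_grad)  # False
print(w.grad)           # populated: tensor([1., 1., 1.])
print(x.grad)           # None: gradients are not tracked for x
```

Only leaf tensors with requires_grad=True get their grad attribute populated by backward().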

What is TF Stop_gradient?

tf.stop_gradient provides a way to not compute gradients with respect to some variables during back-propagation.

What is detach in Pytorch?

detach() is used to detach a tensor from the current computational graph. It returns a new tensor that doesn't require a gradient. When we don't need a tensor to be traced for the gradient computation, we detach the tensor from the current computational graph.

How do you stop gradient flow Pytorch?

If you know during the forward pass which part you want to block the gradients from, you can call .detach() on the output of that block to exclude it from the backward pass.
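A small sketch of blocking gradient flow through one branch of a computation (values chosen only for illustration):

```python
import torch

a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)

# Detach the output of the "blocked" branch: no gradients flow back into a.
blocked = (a * 2).detach()
out = blocked * b
out.backward()

print(a.grad)  # None: the detached branch is cut out of the backward pass
print(b.grad)  # tensor(4.): d(out)/db = blocked = 4
```

The detached tensor is still used as a value in the forward pass; only its gradient path is severed.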


2 Answers

Try x.detach(). It returns a new tensor that is detached from the computation graph, so no gradients flow back through it during back-propagation.
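A quick sketch of x.detach() behaving like tf.stop_gradient(x) inside an expression (the tensor values are illustrative):

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)

# The squared term uses a stopped-gradient copy of x,
# analogous to tf.stop_gradient(x) ** 2 + x in TensorFlow.
y = (x.detach() ** 2).sum() + x.sum()
y.backward()

# Only the x.sum() term contributes to the gradient.
print(x.grad)  # tensor([1., 1.])
```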

answered Oct 13 '22 by Deepali


Tensors in PyTorch have a requires_grad attribute. Set it to False to prevent gradient computation for those tensors.
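A minimal sketch of freezing parameters this way, using a small nn.Linear layer as an example:

```python
import torch
import torch.nn as nn

# Turn off requires_grad on a layer's parameters to exclude its
# weights from gradient computation (a common way to freeze layers).
layer = nn.Linear(4, 2)
for p in layer.parameters():
    p.requires_grad = False  # equivalently: p.requires_grad_(False)

out = layer(torch.randn(1, 4)).sum()
# out itself does not require grad, since nothing upstream tracks gradients.
print(out.requires_grad)  # False
```

Note that calling backward() on such an output would raise an error, since nothing in the graph requires a gradient.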

answered Oct 13 '22 by Shai