
PyTorch warning about using a non-full backward hook when the forward contains multiple autograd Nodes

After a recent upgrade, when running my PyTorch loop, I now get the warning

"Using a non-full backward hook when the forward contains multiple autograd Nodes".

The training still runs and completes, but I am unsure where I am supposed to call the register_full_backward_hook function.

I have tried adding it to each of the layers in my neural network, but this gives further errors about using different hooks.

Can anyone please advise?

asked Nov 14 '22 by IllyShaieb

1 Answer

PyTorch version 1.8.0 deprecated register_backward_hook (source code) in favor of register_full_backward_hook (source code).

You can find it in the patch notes here: Deprecated old style nn.Module backward hooks (PR #46163)

The warning you're getting:

Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.

simply indicates that you should replace all register_backward_hook calls with register_full_backward_hook in your code to get the behavior described in the documentation.
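As a minimal sketch of the migration (the model below is a stand-in, not your network — any module whose forward runs more than one autograd operation, such as this small Sequential, triggers the warning with the old hook):

```python
import torch
import torch.nn as nn

# Stand-in model: its forward contains multiple autograd Nodes,
# which is what makes the old register_backward_hook unreliable.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

captured = {}

def hook(module, grad_input, grad_output):
    # grad_input / grad_output are tuples of gradients w.r.t. the
    # module's inputs and outputs; the full hook reports all of grad_input.
    captured["grad_output"] = grad_output

# Old (deprecated):  handle = model.register_backward_hook(hook)
# New:
handle = model.register_full_backward_hook(hook)

out = model(torch.randn(3, 4))
out.sum().backward()   # hook fires during the backward pass
handle.remove()        # detach the hook when you no longer need it
```

The call signature of the hook itself is unchanged, so usually only the registration line needs editing. Note that full backward hooks do not allow the hooked module to modify its inputs in place, so if a layer uses, e.g., nn.ReLU(inplace=True), you may need to switch it to the out-of-place version.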

answered Nov 24 '22 by Ivan