Where can I find the backpropagation (through time) code in TensorFlow (Python API)? Or are other algorithms used?
For example, when I create an LSTM net.
All backpropagation in TensorFlow is implemented by automatically differentiating the operations in the forward pass of the network, and adding explicit operations for computing the gradient at each point in the network. The general implementation can be found in tf.gradients(), but the particular version used depends on how your LSTM is implemented:
- If the LSTM is statically unrolled into the graph, TensorFlow applies tf.gradients() to build an unrolled backpropagation loop in the opposite direction, which is what backpropagation through time amounts to (see the first sketch below).
- If the LSTM is implemented with tf.while_loop(), as in a dynamically unrolled RNN, it uses additional support for differentiating loops in control_flow_grad.py (see the second sketch below).
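For the first case, here is a minimal sketch, assuming TensorFlow 1.x graph mode; the cell type, tf.nn.static_rnn(), and all shapes are illustrative choices rather than anything taken from the question. The RNN is unrolled step by step into the graph, and tf.gradients() adds the corresponding backward operations in reverse order.

```python
import tensorflow as tf  # assumes TensorFlow 1.x (graph mode)

batch_size, num_steps, input_size, hidden_size = 2, 5, 3, 4

# Fixed-length sequence input: [batch, time, features].
inputs = tf.placeholder(tf.float32, [batch_size, num_steps, input_size])

# Split the sequence into a Python list of per-step tensors, so the
# LSTM cell is copied num_steps times into the graph (static unrolling).
step_inputs = tf.unstack(inputs, axis=1)
cell = tf.nn.rnn_cell.LSTMCell(hidden_size)
outputs, final_state = tf.nn.static_rnn(cell, step_inputs, dtype=tf.float32)

# A toy loss on the last output; tf.gradients() walks the unrolled
# forward graph and builds an unrolled backward pass through all steps.
loss = tf.reduce_sum(outputs[-1])
grads = tf.gradients(loss, tf.trainable_variables())
```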
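For the second case, a similarly hedged sketch, again assuming TensorFlow 1.x; tf.nn.dynamic_rnn() is used here because it builds the time loop with tf.while_loop() internally, so calling tf.gradients() through it exercises TensorFlow's loop-differentiation support rather than a fully unrolled backward pass.

```python
import tensorflow as tf  # assumes TensorFlow 1.x (graph mode)

batch_size, max_time, input_size, hidden_size = 2, 5, 3, 4

# Sequence input: [batch, time, features].
inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_size])
cell = tf.nn.rnn_cell.LSTMCell(hidden_size)

# dynamic_rnn runs the cell inside a tf.while_loop() instead of
# copying it once per time step into the graph.
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

# Differentiating through the while_loop uses TensorFlow's control-flow
# gradient support (the code the answer points to in control_flow_grad.py).
loss = tf.reduce_sum(outputs[:, -1, :])
grads = tf.gradients(loss, tf.trainable_variables())
```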