 

Backward propagation in Keras?


Can anyone tell me how backpropagation is done in Keras? I read that it is really easy in Torch and complex in Caffe, but I can't find anything about doing it in Keras. I am implementing my own layers in Keras (as a beginner) and would like to know how to do the backward propagation.

Thank you in advance

Tassou asked Nov 21 '17 15:11


1 Answer

You simply don't. (Late edit: except when you are writing custom training loops, which is only for advanced uses.)
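For completeness, here is a minimal sketch of that "advanced" exception: a custom training loop in TensorFlow 2.x, where you run the backward pass yourself with `tf.GradientTape` and apply the gradients manually. The model and data here are hypothetical placeholders.

```python
# Custom training loop sketch (TF 2.x): tf.GradientTape records the forward
# pass so gradients can be computed explicitly, then the optimizer applies them.
import numpy as np
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.layers.Dense(1, input_shape=(2,))])
optimizer = keras.optimizers.SGD(learning_rate=0.1)
loss_fn = keras.losses.MeanSquaredError()

x = np.ones((4, 2), dtype="float32")
y = np.zeros((4, 1), dtype="float32")

with tf.GradientTape() as tape:
    pred = model(x, training=True)   # forward pass, recorded by the tape
    loss = loss_fn(y, pred)
grads = tape.gradient(loss, model.trainable_weights)   # the "backward pass"
optimizer.apply_gradients(zip(grads, model.trainable_weights))
```

This is exactly what `fit` does for you under the hood, one batch at a time.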

Keras does backpropagation automatically. There's absolutely nothing you need to do for that except for training the model with one of the fit methods.
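To make that concrete, here is a minimal sketch (assuming TensorFlow 2.x with its bundled Keras, and random toy data): nothing backpropagation-related appears in the code, because `fit` handles it.

```python
# fit() runs the forward pass, loss, backpropagation, and weight updates
# for every batch; no gradient code is written by the user.
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

history = model.fit(x, y, epochs=2, verbose=0)
```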

You just need to take care of a few things:

  • The variables you want updated by backpropagation (that means: the weights) must be created in the custom layer with the self.add_weight() method, inside the build method. See "Writing your own Keras layers" in the Keras documentation.
  • All calculations you're doing must use basic operators such as +, -, *, / or backend functions (keras.backend, which dispatches to TensorFlow/Theano/CNTK; the backend's own functions are also supported).

This is all you need to have the automatic backpropagation working properly.
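A sketch of a custom layer following those two rules (the layer name and shapes are made up for illustration): weights are declared with add_weight inside build, and call uses only differentiable ops, so gradients flow through it automatically during fit.

```python
# Hypothetical custom layer: trainable weights in build(), differentiable
# ops in call(); Keras backpropagates through it with no extra code.
import numpy as np
import tensorflow as tf
from tensorflow import keras

class ScaledDense(keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # Weights created here are tracked and updated by backpropagation.
        self.w = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform", trainable=True, name="w")
        self.b = self.add_weight(
            shape=(self.units,),
            initializer="zeros", trainable=True, name="b")

    def call(self, inputs):
        # Only basic operators and backend/TF functions: gradients just work.
        return tf.nn.relu(inputs @ self.w + self.b)

model = keras.Sequential([ScaledDense(4), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")
x = np.ones((8, 3), dtype="float32")
y = np.zeros((8, 1), dtype="float32")
model.fit(x, y, epochs=1, verbose=0)
```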

If your layer doesn't have trainable weights, you don't need a custom layer at all: use a Lambda layer instead (calculations only, no trainable weights).
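For example, a weight-free operation like squaring can be a one-line Lambda layer (the operation here is just an illustration):

```python
# Lambda layer sketch: pure computation, no trainable weights, so there is
# nothing for backpropagation to update inside it (gradients still pass through).
import numpy as np
import tensorflow as tf
from tensorflow import keras

square = keras.layers.Lambda(lambda t: tf.square(t))
out = square(np.array([[1.0, 2.0, 3.0]], dtype="float32"))
```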

Daniel Möller answered Nov 11 '22 18:11