 

Delta rule vs. gradient descent?

What's the difference between gradient descent and the delta rule?

asked Feb 04 '11 by user602774


1 Answer

Without math: the delta rule uses gradient descent to adjust a perceptron network's weights so that the network's error is minimized.
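
For concreteness, here is a minimal Python sketch of the delta rule for a single linear unit; the toy inputs, targets, and learning rate are arbitrary illustrative choices:

```python
import numpy as np

# Delta rule for one linear unit: for each example, nudge the weights by
# eta * (target - output) * input. This is gradient descent on the
# squared error E = 0.5 * (target - output)^2 of that example.

X = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])          # toy inputs (assumed for illustration)
t = np.array([1.0, 1.0, 2.0])       # toy targets
w = np.zeros(2)                     # weights to learn
eta = 0.1                           # learning rate (arbitrary choice)

for epoch in range(100):
    for x_i, t_i in zip(X, t):
        y_i = w @ x_i                  # the unit's output
        w += eta * (t_i - y_i) * x_i   # delta rule update

print(w)  # approaches [1.0, 1.0] for this toy data
```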

Gradient descent is a general algorithm that gradually changes a vector of parameters in order to minimize an objective function. It does this by repeatedly moving in the direction of steepest descent, i.e. the direction of the negative gradient.

You find this direction by taking the derivative (gradient) of the objective function. It's like dropping a marble into a smooth hilly landscape: it rolls downhill, but it is only guaranteed to reach a local minimum. So, the short answer is that the delta rule is a specific algorithm that applies the general algorithm of gradient descent.
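
And a minimal sketch of plain gradient descent on a toy one-parameter objective; the objective, starting point, and step size are arbitrary illustrative choices:

```python
# Minimize f(w) = (w - 3)^2, whose gradient is f'(w) = 2 * (w - 3).

def gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0              # starting point (arbitrary)
learning_rate = 0.1  # step size (arbitrary)

for step in range(100):
    w -= learning_rate * gradient(w)   # step against the gradient

print(w)  # converges toward the minimizer w = 3
```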

answered Nov 02 '22 by André C. Andersen