 

Are there alternatives to backpropagation?

I know a neural network can be trained using gradient descent and I understand how it works.

Recently, I stumbled upon other training algorithms: conjugate gradient and quasi-Newton algorithms. I tried to understand how they work, but the only good intuition I could get is that they use higher-order derivatives.

My questions are the following: are those alternative algorithms I mentioned fundamentally different from a backpropagation process where weights are adjusted by using the gradient of the loss function? If not, is there an algorithm to train a neural network that is fundamentally different from the mechanism of backpropagation?

Thanks

asked Mar 21 '19 by Nope


2 Answers

Conjugate gradient and quasi-Newton methods are still gradient-based optimization algorithms. Backpropagation (or backprop) is nothing more than a fancy name for the gradient computation itself.
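
To make that concrete, here is a minimal sketch (PyTorch assumed; the toy data, model, and hyperparameters are made up for illustration) showing that the backward pass only produces gradients, and that plain SGD and a quasi-Newton method (L-BFGS) simply consume that same gradient with different update rules:

```python
# Backprop computes gradients; SGD and L-BFGS are just different ways of using them.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(64, 3)            # toy inputs
y = torch.randn(64, 1)            # toy targets
model = nn.Sequential(nn.Linear(3, 8), nn.Tanh(), nn.Linear(8, 1))
loss_fn = nn.MSELoss()

# Plain gradient descent: backprop fills .grad, SGD steps along it.
sgd = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(100):
    sgd.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()               # <-- this is "backpropagation"
    sgd.step()

# Quasi-Newton (L-BFGS): the gradient still comes from the same backward pass;
# only the update rule that consumes it changes.
lbfgs = torch.optim.LBFGS(model.parameters(), lr=0.5, max_iter=20)

def closure():
    lbfgs.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    return loss

lbfgs.step(closure)
print(loss_fn(model(X), y).item())
```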

However, the original question about alternatives to backprop is very important. One recent alternative, for example, is equilibrium propagation (eqprop for short).
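
The sketch below is a rough, heavily simplified illustration of the eqprop idea on a tiny made-up network (one hidden layer, hard-sigmoid units, arbitrary hyperparameters; see the Scellier & Bengio 2017 paper for the actual algorithm). The states relax to an energy minimum in a "free" phase, are nudged toward the target in a second phase, and the weight update is a purely local contrast of activity products, with no backward pass:

```python
# Simplified equilibrium-propagation-style update (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def rho(s):                    # hard sigmoid activation
    return np.clip(s, 0.0, 1.0)

def rho_prime(s):
    return ((s > 0.0) & (s < 1.0)).astype(float)

x = rng.uniform(size=(1, 3))   # clamped input
y = np.array([[1.0, 0.0]])     # target for the output units
W1 = rng.normal(scale=0.1, size=(3, 4))   # input  <-> hidden coupling
W2 = rng.normal(scale=0.1, size=(4, 2))   # hidden <-> output coupling

def relax(h, o, beta, steps=50, eps=0.5):
    """Gradient descent on the total energy E + beta * C over the states."""
    for _ in range(steps):
        dE_dh = h - rho_prime(h) * (rho(x) @ W1 + rho(o) @ W2.T)
        dE_do = o - rho_prime(o) * (rho(h) @ W2)
        dC_do = o - y                       # cost gradient, outputs only
        h = h - eps * dE_dh
        o = o - eps * (dE_do + beta * dC_do)
    return h, o

beta, lr = 0.5, 0.05
h0, o0 = relax(np.zeros((1, 4)), np.zeros((1, 2)), beta=0.0)   # free phase
hb, ob = relax(h0, o0, beta=beta)                              # nudged phase

# Local, contrastive weight update: difference of activity products.
W1 += lr / beta * (rho(x).T @ rho(hb) - rho(x).T @ rho(h0))
W2 += lr / beta * (rho(hb).T @ rho(ob) - rho(h0).T @ rho(o0))
```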

answered Sep 22 '22 by penkovsky


NeuroEvolution of Augmenting Topologies (NEAT) is another way to learn both the topology of the network and its weights/biases, using a genetic algorithm.
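
Full NEAT also mutates the topology (adding nodes and connections); the toy sketch below (numpy, with a made-up network and hyperparameters) only evolves the weights of a fixed architecture, just to illustrate that a genetic algorithm trains by mutation and selection, with no gradient or backward pass at all:

```python
# Gradient-free training of a tiny fixed-topology network via mutation + selection.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                           # toy inputs
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)   # toy targets

def forward(params, x):
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))            # sigmoid output

def fitness(params):
    pred = forward(params, X)
    return -np.mean((pred - y) ** 2)                   # higher is better

def random_params():
    return [rng.normal(scale=0.5, size=s) for s in [(3, 8), (8,), (8, 1), (1,)]]

def mutate(params, sigma=0.1):
    return [p + rng.normal(scale=sigma, size=p.shape) for p in params]

population = [random_params() for _ in range(50)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)         # select the fittest
    parents = population[:10]
    # Next generation: keep the parents, fill the rest with mutated copies.
    population = parents + [mutate(parents[i % 10]) for i in range(40)]

best = max(population, key=fitness)
print("best MSE:", -fitness(best))
```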

answered Sep 19 '22 by Frank Liu