Why does nobody use Hopfield nets for MNIST?

I am currently tinkering with Hopfield nets, which are quite interesting neural networks. I am writing my own code to recognize MNIST digits with a Hopfield net, but it doesn't work correctly, so I searched for it online. Astonishingly, I barely found anything relating Hopfield nets to MNIST digits. Can anybody tell me the reason?

Asked Aug 16 '16 by Hanyu Guo

People also ask

What is the disadvantage of Hopfield network?

One of the major shortcomings of the Hopfield neural network (HNN) is that the network may not always converge to a fixed point. HNNs are, predominantly, limited to local optimization during training to achieve network stability.

Is Hopfield network stable?

It is easy to show that a state transition of a Hopfield network never increases the energy E. Hence, from any start configuration, the network always reaches a stable state by repeated application of the state-change mechanism.
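Here is a minimal sketch of that energy argument (my own illustration, not part of the original page), assuming bipolar states in {-1, +1}, symmetric weights, and a zero diagonal; the assertion checks that each asynchronous update leaves the energy unchanged or lowers it:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 16

    # Symmetric weights with zero diagonal: the standard conditions under
    # which an asynchronous update can never increase the energy.
    W = rng.normal(size=(n, n))
    W = (W + W.T) / 2
    np.fill_diagonal(W, 0)

    def energy(s):
        # E(s) = -1/2 * s^T W s for a bipolar state s in {-1, +1}^n
        return -0.5 * s @ W @ s

    s = rng.choice([-1, 1], size=n)
    prev = energy(s)
    for _ in range(200):
        i = rng.integers(n)                # update one unit at a time
        s[i] = 1 if W[i] @ s >= 0 else -1  # take the sign of the local field
        e = energy(s)
        assert e <= prev + 1e-12           # energy never increases
        prev = e
    print("settled at energy", prev)

Since the energy is bounded below and can only stay flat or fall, repeated updates must eventually settle into a stable state.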

What are Hopfield networks used for?

Hopfield networks can be used in data science and machine learning for a number of different tasks. For example, they can be used for data compression, optimization, and image recognition. Additionally, they can be used to solve problems such as the traveling salesman problem and the knapsack problem.

Is a Hopfield network an MLP?

Not quite. The Hopfield network is a predecessor of the Restricted Boltzmann Machine (RBM), whereas the MLP is a feed-forward architecture. The Hopfield network is an energy-based, auto-associative, recurrent, and biologically inspired network. It is called energy-based because an energy function is defined over its states: the stored patterns sit at minima of that function, and recall works by descending the energy toward the nearest minimum.
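For reference, the weights of a classic Hopfield net are set in one shot with the Hebbian outer-product rule rather than by gradient descent; a sketch, assuming bipolar patterns:

    import numpy as np

    def hebbian_weights(patterns):
        # patterns: array of shape (P, n) with entries in {-1, +1}.
        # Outer-product rule: W = (1/P) * sum_p x_p x_p^T, diagonal zeroed.
        P, n = patterns.shape
        W = patterns.T @ patterns / P
        np.fill_diagonal(W, 0)
        return W

Each stored pattern then sits at (or near) a local minimum of the energy; for random patterns, this rule reliably stores only about 0.138 * n patterns before the minima start to interfere.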


1 Answer

As far as I understand it, Hopfield networks are good at retrieving a stored pattern similar to a given input (content-addressable memory). They are not directly applicable to classification, so you would need a classifier (e.g. an MLP or k-NN) after the Hopfield network anyway, as the sketch below illustrates. That is probably the reason they aren't used for MNIST.
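To make this concrete, here is a hypothetical end-to-end sketch (the prototypes are random stand-ins rather than real MNIST data): store one bipolar prototype per digit class with the Hebbian rule, recall from a corrupted input, and then classify with a separate nearest-prototype match. That final matching step is exactly the classifier the Hopfield net itself does not provide:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 28 * 28                                     # MNIST-sized, flattened
    prototypes = rng.choice([-1, 1], size=(10, n))  # stand-ins: one prototype per digit

    # Hebbian storage
    W = prototypes.T @ prototypes / len(prototypes)
    np.fill_diagonal(W, 0)

    def recall(x, steps=10):
        # Repeated sign updates walk downhill in energy toward a stored pattern.
        s = x.copy()
        for _ in range(steps):
            s = np.where(W @ s >= 0, 1, -1)
        return s

    # Corrupt one prototype, let the net clean it up, then classify separately.
    x = prototypes[3].copy()
    flip = rng.choice(n, size=n // 10, replace=False)
    x[flip] *= -1                                   # flip 10% of the "pixels"

    s = recall(x)
    label = int(np.argmax(prototypes @ s))          # the extra nearest-match step
    print("predicted class:", label)                # 3, if recall succeeded

Replacing the random prototypes with binarized per-class mean images of MNIST keeps the structure the same, but the ten digit classes are strongly correlated, so the stored minima interfere; that is likely another practical reason the approach is rarely pursued.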

Also, for MNIST the bar is relatively high: the lowest published error is at about 0.23% (source). Publishing results that are (much) worse than the state of the art is difficult, and there is basically no incentive to do so, or even to try, unless you are doing something completely new or can expect to do much better.

For people coming here to learn about Hopfield networks: J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities", Proceedings of the National Academy of Sciences, 1982.

For German speakers: my German mini-summary of Hopfield networks.

Answered Sep 21 '22 by Martin Thoma