I am currently tinkering with Hopfield nets, which are quite interesting neural networks. I am also writing my own code for a Hopfield net to identify MNIST digits, but it doesn't work correctly, so I searched for it online. Astonishingly, I barely found anything relating Hopfield nets to MNIST digits. Can anybody tell me the reason?
One of the major shortcomings of the Hopfield neural network (HNN) is that the network may not always converge to a fixed point (with synchronous updates it can end up oscillating between two states). Moreover, the HNN is essentially limited to local optimization: its dynamics settle into a nearby local minimum of the energy rather than a global optimum.
It is easy to show that an asynchronous state transition of a Hopfield network never increases the energy E (and strictly decreases it whenever a neuron actually flips, given symmetric weights and no self-connections). Hence, for any start configuration, the network always reaches a stable state by repeated application of the state change mechanism.
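To make that concrete, here is a minimal sketch (Python/NumPy, with made-up random weights, not anyone's actual MNIST code) of the usual energy function E = -1/2 s^T W s and repeated asynchronous updates; with symmetric weights and zero diagonal the energy is non-increasing, so the state settles into a local minimum:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: symmetric weights with zero diagonal (the assumptions of the
# classical convergence argument) and a random bipolar state in {-1, +1}.
n = 16
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
s = rng.choice([-1, 1], size=n)

def energy(W, s):
    # E = -1/2 * s^T W s   (bias/threshold terms omitted for brevity)
    return -0.5 * s @ W @ s

# Asynchronous updates: pick one neuron at a time and align it with the
# sign of its local field. Each such update can only lower (or keep) E,
# so the recorded energies form a non-increasing sequence.
energies = [energy(W, s)]
for _ in range(200):
    i = rng.integers(n)
    s[i] = 1 if W[i] @ s >= 0 else -1
    energies.append(energy(W, s))

assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:]))
print("start energy:", energies[0], "final energy:", energies[-1])
```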
Hopfield networks can be used in data science and machine learning for a number of different tasks, for example data compression, optimization, and image recognition. They have also been applied to combinatorial problems such as the traveling salesman problem and the knapsack problem.
The Hopfield network is a predecessor of the Restricted Boltzmann Machine (RBM) and the Multilayer Perceptron (MLP). It is an energy-based, auto-associative, recurrent, and biologically inspired memory network. It is called energy-based because it defines an energy function over its states: recall works by letting the dynamics minimize that energy, and the weights are chosen so that the stored patterns sit at energy minima.
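For reference, the standard way to set those weights is the Hebbian (outer-product) rule; a minimal NumPy sketch (illustrative only, not necessarily what the asker's code does):

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian / outer-product rule: W = (1/N) * sum_p x_p x_p^T, zero diagonal.

    `patterns` has shape (P, N) with entries in {-1, +1}; each stored
    pattern then sits at (or near) a local minimum of the energy.
    """
    P, N = patterns.shape
    W = patterns.T.astype(float) @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W
```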
As far as I understand it, Hopfield networks are good at retrieving stored patterns similar to a given input (content-addressable memory). They are not directly applicable to classification, so you would still need a classifier (e.g. an MLP or k-NN) after the Hopfield network anyway, which is probably the reason why it isn't used.
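To illustrate that point, one hypothetical pipeline (just a sketch, assuming the `hebbian_weights` helper above and a binarized, flattened digit `x` in {-1, +1}) would let the Hopfield net denoise the input and then apply a separate, trivial nearest-prototype step to the recalled state; the classification is still done by that second step, not by the Hopfield net itself:

```python
import numpy as np

def recall(W, x, steps=500, seed=0):
    # Asynchronous recall: run a fixed number of single-neuron updates.
    rng = np.random.default_rng(seed)
    s = x.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

def classify(W, prototypes, labels, x):
    # Hopfield net as a denoiser, followed by a separate classifier
    # (here: nearest stored prototype by overlap) on the recalled state.
    r = recall(W, x)
    overlaps = prototypes @ r          # similarity of r to each stored pattern
    return labels[int(np.argmax(overlaps))]
```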
Also, for MNIST the bar is relatively high: the lowest reported error is about 0.23% (source). Publishing results that are (much) worse than the state of the art is difficult, and there is basically no incentive to do so, or even to try, if you are not doing something completely new or cannot expect to do much better.
For people coming here to learn about Hopfield networks: J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities", Proceedings of the National Academy of Sciences, 1982.
For German speakers: my German mini-summary of Hopfield networks.