I am quite new to artificial neural networks, and what I cannot understand is why we need the concept of layers. Isn't it enough to connect each neuron to some other neurons, creating a kind of web rather than a layer-based structure?
For example, to solve XOR we usually need at least 3 layers: an input layer with 2 neurons, 1+ hidden layer(s) with some neurons, and an output layer with 1 neuron.
Couldn't we instead create a network with 2 input neurons (we need those) and 1 output neuron, connected by a web of other neurons?
Example of what I mean:
The term 'layer' means something different from what you might think. There is always a 'web' of neurons; a layer just denotes a group of neurons.
If I connect layer X to layer Y, that usually means connecting every neuron in layer X to every neuron in layer Y. But not always! You could also connect each neuron in layer X to just one neuron in layer Y. There are lots of different connection techniques.
But layers aren't required! They just make the coding (and explanation) process a whole lot easier. Instead of connecting all neurons one by one, you can connect them in layers. It's far easier to say "layers A and B are connected" than "neurons 1, 2, 3, 4, 5 are all connected to neurons 6, 7, 8, 9".
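To make that concrete, here is a minimal sketch (generic TypeScript with made-up helper names, not any particular library's API) showing that a "layer" is just a list of neuron ids, and that "connecting two layers" is just a loop over neuron pairs:

```typescript
// A "connection" is just a weighted edge between two neuron ids,
// and a "layer" is just an array of ids.
type Connection = { from: number; to: number; weight: number };

// All-to-all: connect every neuron in X to every neuron in Y.
function connectAllToAll(layerX: number[], layerY: number[]): Connection[] {
  const connections: Connection[] = [];
  for (const from of layerX) {
    for (const to of layerY) {
      connections.push({ from, to, weight: Math.random() * 2 - 1 });
    }
  }
  return connections;
}

// A different technique: connect each neuron in X to exactly one neuron in Y.
function connectOneToOne(layerX: number[], layerY: number[]): Connection[] {
  return layerX.map((from, i) => ({
    from,
    to: layerY[i % layerY.length],
    weight: Math.random() * 2 - 1,
  }));
}

// "Layers A and B are connected" is just shorthand for the nested loop above.
const layerA = [1, 2, 3, 4, 5];
const layerB = [6, 7, 8, 9];
console.log(connectAllToAll(layerA, layerB).length); // 20 connections
```

So layers are bookkeeping: the result is still a plain list of neuron-to-neuron connections, i.e. a web.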
If you are interested in 'layerless' networks, take a look at Liquid State Machines (the neurons might look layered, but they aren't!).
PS: I develop a JavaScript neural network library, and I have created an online demo in which a neural network evolves into an XOR gate - without layers, just starting with input and output. View it here. Your example picture is exactly the kind of network you could develop with this library.
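To make the "no layers, just input and output" idea concrete, here is a rough sketch (generic TypeScript, not the library's actual API) of a network stored purely as a web of neurons and weighted connections. The XOR wiring is hand-picked for illustration rather than evolved: the inputs feed both a helper neuron and the output directly, so there is no clean layer structure.

```typescript
// A "layerless" network: just neurons and weighted connections,
// evaluated in whatever order the dependency graph allows.
type Neuron = {
  id: string;
  bias: number;
  inputs: { from: string; weight: number }[]; // empty for input neurons
};

const step = (x: number) => (x > 0 ? 1 : 0);

// Evaluate the web by resolving each neuron's value on demand
// (the graph is acyclic here, so recursion with memoization is enough).
function activate(neurons: Neuron[], inputValues: Record<string, number>): Record<string, number> {
  const byId = new Map(neurons.map((n) => [n.id, n]));
  const values: Record<string, number> = { ...inputValues };
  const evaluate = (id: string): number => {
    if (id in values) return values[id];
    const n = byId.get(id)!;
    const sum = n.inputs.reduce((acc, c) => acc + c.weight * evaluate(c.from), n.bias);
    values[id] = step(sum);
    return values[id];
  };
  for (const n of neurons) evaluate(n.id);
  return values;
}

// 2 inputs, 1 helper neuron (an AND gate), 1 output - not layered,
// because the inputs also connect directly to the output.
const web: Neuron[] = [
  { id: "x1", bias: 0, inputs: [] },
  { id: "x2", bias: 0, inputs: [] },
  { id: "and", bias: -1.5, inputs: [{ from: "x1", weight: 1 }, { from: "x2", weight: 1 }] },
  { id: "out", bias: -0.5, inputs: [
      { from: "x1", weight: 1 },
      { from: "x2", weight: 1 },
      { from: "and", weight: -2 },
    ] },
];

for (const [a, b] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
  console.log(a, b, "->", activate(web, { x1: a, x2: b })["out"]); // 0, 1, 1, 0
}
```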