 

Neural Networks: Does the input layer consist of neurons?

I am currently studying neural network theory, and everywhere I read that a network consists of the following layers:

  • Input Layer
  • Hidden Layer(s)
  • Output Layer

Some graphical descriptions show the input layer as real nodes in the network, while others show it as just a vector of values [x1, x2, ..., xn].

What is the correct structure?

Is the "input layer" a real layer of neurons? Or is this just abstractly named as layer, while it really is just the input vector?

Here are two contradictory (and confusing) images I found on the web:

Here it looks like the input layer consists of neurons: Screenshot

Here it looks like the input layer is just an input vector: Screenshot

asked Feb 02 '15 by SomethingSomething


1 Answer

Is the "input layer" a real layer of neurons? Or is this just abstractly named as layer, while it really is just the input vector?

Yes, it's both, depending on the level of abstraction. On paper, the network has input neurons. At the implementation level, you have to organize this data (usually in arrays/vectors), which is why people speak of an input vector:

An input vector holds the input neuron values (representing the input layer).

If you're familiar with the basics of graph theory or image processing, it's the same principle: you can call an image a matrix (the technical view) or a field of pixels (the more abstract view).
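To make the implementation-level view concrete, here is a minimal NumPy sketch (the layer sizes, sigmoid activation, and variable names are illustrative choices, not from the question). Notice that the "input layer" is nothing more than the vector x; only the hidden and output layers carry weights and apply an activation:

```python
import numpy as np

# A tiny feed-forward network: 3 inputs -> 4 hidden neurons -> 2 outputs.
# The "input layer" has no weights or activation of its own;
# it is just the input vector handed to the first weight matrix.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # weights: input -> hidden
b1 = np.zeros(4)
W2 = rng.standard_normal((2, 4))   # weights: hidden -> output
b2 = np.zeros(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    # x IS the input layer: just a plain vector of values [x1, x2, x3]
    hidden = sigmoid(W1 @ x + b1)   # first layer that actually computes anything
    output = sigmoid(W2 @ hidden + b2)
    return output

x = np.array([0.5, -1.2, 3.0])     # the input layer, viewed as an input vector
print(forward(x))
```

Whether you draw x1, x2, x3 as circles in a diagram (neurons) or store them in an array (a vector) is only a difference in presentation, not in the network itself.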

answered Jan 02 '23 by runDOSrun