 

Why does a neural network tend to output the 'mean value'?

I am using Keras to build a simple neural network for a regression task, but the output always tends toward the 'mean value' of the ground-truth y data. See the first figure: blue is the ground truth, red is the predicted value (very close to the constant mean of the ground truth).

Also, the model stops learning very early even though I set epochs=100.

Does anyone have ideas about the conditions under which a neural network stops learning early, and why the regression output tends toward 'the mean' of the ground truth?

Thanks!

[Figure 1: blue = ground truth; red = predicted value]

[Figure 2: learning rate]
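A minimal Keras regression setup along these lines might look roughly like the sketch below; the data files, layer sizes, and optimizer are assumptions, since the actual code is not shown here. Printing the training history shows where the loss plateaus.

import numpy as np
from tensorflow import keras

# Hypothetical data; the real arrays and shapes were not posted.
X = np.load("features.npy")   # shape (n_samples, n_features), assumed
y = np.load("targets.npy")    # shape (n_samples, 1), assumed

# A simple regression network (sizes chosen for illustration only).
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(X.shape[1],)),
    keras.layers.Dense(1),    # linear output for regression
])
model.compile(optimizer="adam", loss="mse")

history = model.fit(X, y, epochs=100, validation_split=0.2, verbose=0)
# The loss curve shows where learning stops improving (plateaus):
print(history.history["loss"][:5], history.history["loss"][-5:])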

asked Oct 05 '16 by mira67



2 Answers

Possibly because the data are unpredictable? Do you know for certain that the data set has some kind of Nth-order predictability?

Just eyeballing your data set: it lacks periodicity, it lacks homoscedasticity, it lacks any slope or skew or trend or pattern. I can't really tell whether there is anything wrong with your net. In the absence of any pattern, the mean is always the best prediction... and it is entirely possible (although not certain) that the neural net is doing its job.

I suggest you find an easier data set and see if you can tackle that first.
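To make the "mean is the best prediction" point concrete, here is a minimal sketch (not from the original answer; the data, layer sizes, and optimizer are all assumptions): train a small Keras model on pure noise and compare its test MSE with that of a constant prediction equal to the training mean. On patternless data the two typically come out about the same.

import numpy as np
from tensorflow import keras

# Patternless data: the targets have no relationship to the features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = rng.normal(size=(1000, 1))
X_train, X_test = X[:800], X[800:]
y_train, y_test = y[:800], y[800:]

# A small regression network (architecture chosen arbitrarily for illustration).
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, verbose=0)

mse_model = model.evaluate(X_test, y_test, verbose=0)
mse_mean = float(np.mean((y_test - y_train.mean()) ** 2))  # "predict the mean" baseline
print(mse_model, mse_mean)  # typically very close when there is no pattern to learn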

answered Sep 28 '22 by John Wu


The model is not learning from the data. Think of basic linear regression: the 'null' prediction, the prediction you would make if you had no predictors at all, is just the expected value, i.e. the mean. This could be caused by many different issues, but initialization comes to mind: bad initialization leads to no learning. This blog post has good practical advice that may help.
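On the initialization point, a hedged sketch of what explicit initializers look like in Keras (the architecture and hyperparameters here are assumptions for illustration, not the question's actual code):

from tensorflow import keras

n_features = 10  # placeholder; replace with the real input dimension

# Explicit, sensible weight initialization for a small regression network.
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu",
                       kernel_initializer="he_normal",   # suited to ReLU layers
                       input_shape=(n_features,)),
    keras.layers.Dense(64, activation="relu",
                       kernel_initializer="he_normal"),
    keras.layers.Dense(1, kernel_initializer="glorot_uniform"),  # linear output for regression
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3), loss="mse")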

answered Sep 28 '22 by Jeremiah Johnson