 

What is the difference between the train loss and train error?

A professor of mine gave me a small script that he uses to visualize the evolution of his neural net after every training epoch. It plots three values: train loss, train error and test error.

What is the difference between the first two?

asked Jul 11 '16 by Alperen AYDIN


2 Answers

Train Loss: Value of the objective function you are minimizing. This value could be a positive or negative number, depending on the specific objective function.

Train Error: Human-interpretable metric of your model's performance. It usually means the fraction of training examples the model classified incorrectly, so it is always a value between 0 and 1.
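
A minimal sketch of the distinction, assuming a binary classifier trained with cross-entropy as the objective (the probabilities and labels below are made up, not taken from the question's script):

```python
import numpy as np

# Hypothetical mini-batch: predicted probabilities and true labels.
probs  = np.array([0.9, 0.2, 0.6, 0.4])
labels = np.array([1,   0,   0,   1  ])

# Train loss: value of the objective being minimized,
# here binary cross-entropy averaged over the batch.
loss = -np.mean(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))

# Train error: fraction of examples classified incorrectly,
# always between 0 and 1.
preds = (probs >= 0.5).astype(int)
error = np.mean(preds != labels)

print(f"train loss:  {loss:.3f}")   # ~0.540
print(f"train error: {error:.3f}")  # 0.500 (2 of 4 wrong)
```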

answered Dec 01 '22 by ahaque


To understand the difference between error and loss you need to understand how your neural network learns. For the backpropagation algorithm to work, there must be a continuous, differentiable loss function; the loss value is simply the value of that function. Sometimes this loss is exactly what you want to minimize (e.g. the distance between your model and the true function in a regression case), but sometimes the error measure you care about is not continuous, or is impossible to differentiate, and then you have to introduce a separate loss function.

A good example of this is binary classification, where the accuracy-based error is not differentiable. You usually use a cross-entropy or hinge loss as a surrogate in order to improve accuracy. In this case your error will be 1 - accuracy, while your loss will be the value of, e.g., the cross-entropy.
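
A small sketch of why a surrogate loss is needed, using a hypothetical one-weight logistic model (not something from the answer itself): nudging the weight changes the cross-entropy smoothly, so backpropagation gets a useful gradient, while the error rate is a step function that stays flat until a prediction flips.

```python
import numpy as np

def cross_entropy(w, x, y):
    """Cross-entropy loss of a one-weight logistic model p = sigmoid(w * x)."""
    p = 1.0 / (1.0 + np.exp(-w * x))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def error_rate(w, x, y):
    """Error = 1 - accuracy: fraction misclassified at the 0.5 threshold."""
    p = 1.0 / (1.0 + np.exp(-w * x))
    return np.mean((p >= 0.5).astype(int) != y)

x = np.array([1.0, 2.0, -1.0, -0.5])
y = np.array([1,   1,    0,    0  ])

# Small changes in w move the loss continuously, but the error
# does not change at all here: no prediction crosses the threshold.
for w in (0.9, 1.0, 1.1):
    print(f"w={w:.1f}  loss={cross_entropy(w, x, y):.4f}  "
          f"error={error_rate(w, x, y):.2f}")
```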

answered Dec 01 '22 by Marcin Możejko