
How to interpret increase in both loss and accuracy

I have run deep learning models (CNNs) using TensorFlow. Many times during training I have observed that loss and accuracy both increased, or both decreased, over the epochs. My understanding was that the two are always inversely related. In what scenario could both increase or decrease simultaneously?

asked Dec 01 '16 by Nitin


People also ask

How do you interpret accuracy and loss?

Higher loss is worse (a bad prediction) for any model. The loss is calculated on the training and validation sets, and its interpretation is how well the model is doing on these two sets. Unlike accuracy, loss is not a percentage: it is a sum of the errors made on each example in the training or validation set.

Does lower loss indicate higher accuracy?

Loss measures how well (or badly) the model is doing. If the errors are high, the loss will be high, which means that the model is not doing a good job. Otherwise, the lower the loss, the better the model works. To calculate the loss, a loss (or cost) function is used.
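For concreteness, here is a minimal sketch using tf.keras cross-entropy (TensorFlow is assumed since the question uses it; the labels and probabilities are made up). Note how the single low-confidence wrong prediction contributes far more to the loss than the two correct ones:

```python
import tensorflow as tf

# Hypothetical 3-class example: true labels and predicted probabilities.
y_true = [0, 1, 2]
y_pred = [[0.9, 0.05, 0.05],   # confident and correct: -ln(0.9)  ~ 0.11
          [0.2, 0.7, 0.1],     # less confident, correct: -ln(0.7) ~ 0.36
          [0.3, 0.4, 0.3]]     # wrong (predicts class 1): -ln(0.3) ~ 1.20

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
print(float(loss_fn(y_true, y_pred)))  # mean over the 3 examples, ~0.56
```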

How do you interpret accuracy in machine learning?

Accuracy is defined as the percentage of correct predictions for the test data. It can be calculated easily by dividing the number of correct predictions by the total number of predictions.
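As a quick sketch of that calculation (made-up labels and predictions):

```python
import numpy as np

# Hypothetical predictions vs. ground truth for 8 test examples.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

accuracy = np.mean(y_pred == y_true)  # correct predictions / total predictions
print(f"accuracy = {accuracy:.2%}")   # 6 of 8 correct -> 75.00%
```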

How do you increase validation accuracy and decrease validation loss?

You can try adding dropout or batch-normalization layers, adding weight regularization, or artificially increasing the size of your training set with data augmentation.
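A hedged sketch of what those techniques look like together in a small tf.keras CNN (the 32x32 RGB input, 10 classes, and all hyperparameters here are illustrative assumptions, not recommendations):

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    layers.Input(shape=(32, 32, 3)),
    # Data augmentation layers artificially enlarge the training set.
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    # Weight (L2) regularization penalizes large kernel weights.
    layers.Conv2D(32, 3, padding="same",
                  kernel_regularizer=regularizers.l2(1e-4)),
    layers.BatchNormalization(),   # normalizes activations batch-wise
    layers.Activation("relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),           # randomly drops units during training
    layers.Dense(10, activation="softmax"),
])
```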


2 Answers

The loss decreases as the training process goes on, except for some fluctuation introduced by mini-batch gradient descent and/or regularization techniques like dropout, which introduce random noise.

If the loss decreases, the training process is going well.

The accuracy (validation accuracy, I suppose), instead, is a measure of how good your model's predictions are.

If the model is learning, the accuracy increases. If the model is overfitting, the accuracy instead stops increasing and can even start to decrease.

If the loss decreases and the accuracy decreases, your model is overfitting.

If the loss increases and the accuracy increases too, it is because your regularization techniques are working well and you are fighting the overfitting problem. This is true only if the loss then starts to decrease while the accuracy continues to increase. Otherwise, if the loss keeps growing, your model is diverging and you should look for the cause (usually you are using too high a learning rate).
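A common way to tell these regimes apart in practice is to record both metrics per epoch and plot training against validation curves. A minimal sketch, assuming a compiled tf.keras `model` (with `metrics=["accuracy"]`) and training arrays `x_train`, `y_train` already exist (hypothetical names):

```python
import matplotlib.pyplot as plt

history = model.fit(x_train, y_train, validation_split=0.2, epochs=30)

# Plot train vs. validation loss and accuracy side by side.
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, metric in zip(axes, ("loss", "accuracy")):
    ax.plot(history.history[metric], label=f"train {metric}")
    ax.plot(history.history["val_" + metric], label=f"val {metric}")
    ax.set_xlabel("epoch")
    ax.legend()
plt.show()
```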

answered Sep 28 '22 by nessuno


I think the top-rated answer is incorrect.

I will assume you are talking about cross-entropy loss, which can be thought of as a measure of 'surprise'.

Loss and accuracy increasing/decreasing simultaneously on the training data tells you nothing about whether your model is overfitting. This can only be determined by comparing loss/accuracy on the validation vs. training data.

If loss and accuracy are both decreasing, it means your model is becoming more confident in its correct predictions, or less confident in its incorrect predictions, or both, hence the decreased loss. However, it is also making more incorrect predictions overall, hence the drop in accuracy. Vice versa if both are increasing. That is all we can say.
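To make this concrete, here is a small made-up binary-classification sketch where, between two "epochs", accuracy drops yet cross-entropy loss also drops, because the remaining correct predictions become much more confident (all numbers are invented for illustration):

```python
import numpy as np

def metrics(p, y):
    """Mean binary cross-entropy and accuracy, given predicted P(y=1)."""
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    acc = np.mean((p > 0.5) == y)
    return loss, acc

y = np.ones(10)  # ten examples, all with true label 1

# Epoch A: 8 barely-correct predictions, 2 barely-wrong ones.
p_a = np.array([0.6] * 8 + [0.4] * 2)
# Epoch B: only 7 correct predictions, but the correct ones are now
# very confident -- accuracy drops, yet so does the loss.
p_b = np.array([0.95] * 7 + [0.45] * 3)

print(metrics(p_a, y))  # loss ~0.59, accuracy 0.80
print(metrics(p_b, y))  # loss ~0.28, accuracy 0.70
```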

answered Sep 28 '22 by nlml