Keras/TensorFlow: binary cross-entropy loss greater than 1

Library: Keras, backend: TensorFlow

I am training a single-class (binary) classification model whose final layer has a single node with a sigmoid activation. I compile the model with the binary cross-entropy loss. When I train it, I notice that the loss is greater than 1. Is that expected, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s.

Is it possible to have the binary cross entropy loss greater than 1?
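For context, here is a minimal sketch of the setup described above; the input size and hidden layer are assumptions, not the asker's actual model:

```python
import tensorflow as tf
from tensorflow import keras

# Minimal sketch of the setup in the question (assumed input size and hidden layer)
model = keras.Sequential([
    keras.Input(shape=(20,)),                      # assumed 20 input features
    keras.layers.Dense(16, activation="relu"),     # assumed hidden layer
    keras.layers.Dense(1, activation="sigmoid"),   # single output node, sigmoid activation
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```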

Blue asked Apr 17 '18

People also ask

Can binary cross-entropy loss be greater than 1?

Yes, it can happen, as the answers below explain. (However, that does not mean your code is free of mistakes.)

What is a good binary cross-entropy loss value?

One might wonder what a good value for cross-entropy loss is, and how to tell whether the training loss is good or bad. Some intuitive guidelines from a Machine Learning Mastery post, for a mean loss computed with the natural log: Cross-Entropy = 0.00: perfect probabilities. Cross-Entropy < 0.02: great probabilities.
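As a rough illustration of those thresholds (the probabilities below are made up), the per-sample loss for a correctly classified example predicted with probability p is simply -log(p):

```python
import numpy as np

# Per-sample binary cross-entropy for a correct prediction made with probability p is -log(p)
for p in [0.999999, 0.98, 0.9]:
    print(p, -np.log(p))   # ~0.000001, ~0.020, ~0.105
```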

What is binary cross-entropy loss in keras?

Binary cross-entropy calculates the cross-entropy loss between the predicted probabilities and the true classes. By default, the sum_over_batch_size reduction is used, which means the loss returned is the average of the per-sample losses in the batch.
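A small sketch of that behavior with made-up labels and predictions: the functional form returns the per-sample losses, while the BinaryCrossentropy loss class applies the default sum_over_batch_size reduction and returns their mean.

```python
import tensorflow as tf

y_true = tf.constant([[1.0], [0.0], [1.0]])   # made-up labels
y_pred = tf.constant([[0.9], [0.2], [0.1]])   # made-up predicted probabilities

# Per-sample losses: -[y*log(p) + (1-y)*log(1-p)]
per_sample = tf.keras.losses.binary_crossentropy(y_true, y_pred)
print(per_sample.numpy())              # ~[0.105, 0.223, 2.303]

# The loss class averages over the batch by default (sum_over_batch_size reduction)
bce = tf.keras.losses.BinaryCrossentropy()
print(bce(y_true, y_pred).numpy())     # ~0.877, the mean of the per-sample losses
```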

Can I use cross-entropy loss for binary classification?

Binary cross-entropy is a special case of categorical cross-entropy with two classes (class 1 and class 0). Formulated this way, we can apply the general cross-entropy formula, -Σ y_i log(ŷ_i) summed over the classes, and with two classes it reduces to the familiar binary cross-entropy -[y log(ŷ) + (1 - y) log(1 - ŷ)].
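A tiny numeric check of that equivalence, using made-up values for the label and predicted probability:

```python
import numpy as np

y, p = 1.0, 0.7   # made-up label and predicted probability of class 1

# Binary cross-entropy: -[y*log(p) + (1-y)*log(1-p)]
bce = -(y * np.log(p) + (1 - y) * np.log(1 - p))

# The same example as 2-class categorical cross-entropy: -sum_i y_i * log(p_i)
y_onehot = np.array([1 - y, y])    # one-hot encoding over [class 0, class 1]
p_vec = np.array([1 - p, p])       # predicted distribution over the two classes
cce = -np.sum(y_onehot * np.log(p_vec))

print(bce, cce)   # both ~0.357
```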


2 Answers

Keras' binary_crossentropy first converts your predicted probabilities back to logits, then uses tf.nn.sigmoid_cross_entropy_with_logits to compute the cross-entropy, and returns the mean. Mathematically speaking, if your label is 1 and your predicted probability is low (say 0.1), the cross-entropy can be greater than 1, as in losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1])).
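Evaluating the example cited in the answer bears this out; the result is just -log(0.1), well above 1:

```python
import numpy as np
import tensorflow as tf

# True label 1, predicted probability 0.1 (the example from the answer)
loss = tf.keras.losses.binary_crossentropy(tf.constant([1.0]), tf.constant([0.1]))
print(loss.numpy())    # ~2.303
print(-np.log(0.1))    # the same value computed by hand
```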

Y. Luo answered Oct 19 '22

Yes, it's correct; the cross-entropy is not bounded to any specific range, it is just positive (> 0).
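To see why, note that for a true label of 1 the per-sample loss is -log(p), which grows without bound as the predicted probability p approaches 0. A quick sketch with made-up probabilities:

```python
import numpy as np

# For a true label of 1, the per-sample loss is -log(p); it diverges as p -> 0
for p in [0.5, 0.1, 0.01, 0.001]:
    print(p, -np.log(p))   # ~0.69, ~2.30, ~4.61, ~6.91
```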

Dr. Snoopy answered Oct 19 '22