Library: Keras, backend: TensorFlow
I am training a binary (single-class) classification problem in which my final layer has a single node with a sigmoid activation, and I compile the model with a binary cross-entropy loss. When I train the model, I notice that the loss is greater than 1. Is that right, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s. (A minimal sketch of this setup is shown below.)
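For reference, a minimal sketch of the setup described above; the hidden layer size, input shape, and random data are illustrative assumptions, not the asker's actual code:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Single sigmoid output node, compiled with binary cross-entropy.
model = keras.Sequential([
    keras.layers.Input(shape=(10,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Labels are all 0s and 1s, as in the question.
X = np.random.rand(100, 10).astype("float32")
y = np.random.randint(0, 2, size=(100, 1)).astype("float32")
model.fit(X, y, epochs=1, batch_size=32)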
Is it possible to have the binary cross entropy loss greater than 1?
Yes, it can happen, as the explanations below show. (However, this does not mean that you do not have mistakes in your code.)
One might wonder: what is a good value for cross-entropy loss, and how do I know whether my training loss is good or bad? Some intuitive guidelines from a MachineLearningMastery post, for a mean loss computed with the natural log: Cross-Entropy = 0.00: perfect probabilities. Cross-Entropy < 0.02: great probabilities.
Binary cross-entropy calculates the cross-entropy loss between the predicted probabilities and the true labels. By default, the sum_over_batch_size reduction is used, which means the loss returns the average of the per-sample losses in the batch, as illustrated below.
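In this hypothetical example (the label and probability values are made up), the mean of the per-sample losses matches the value returned by the loss object with its default reduction:

import tensorflow as tf

y_true = tf.constant([[1.0], [0.0], [1.0]])
y_pred = tf.constant([[0.9], [0.2], [0.4]])

bce = tf.keras.losses.BinaryCrossentropy()        # default sum_over_batch_size reduction
per_sample = tf.keras.losses.binary_crossentropy(y_true, y_pred)

print(per_sample.numpy())                  # one loss value per sample
print(bce(y_true, y_pred).numpy())         # average of the per-sample losses
print(tf.reduce_mean(per_sample).numpy())  # the same average, computed by hand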
Binary cross-entropy is a special case of categorical cross-entropy with two classes (class = 1 and class = 0). If we formulate it this way, we can use the general cross-entropy formula, -Σ_i y_i · log(ŷ_i) summed over the classes, which with two classes reduces to the familiar binary form -(y · log(ŷ) + (1 - y) · log(1 - ŷ)). Notice how this is the same as binary cross-entropy; the sketch below checks the equivalence numerically.
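As a sketch of that equivalence (the probability 0.7 and the label 1 are arbitrary example values), binary cross-entropy on (y, p) gives the same value as categorical cross-entropy on the two-class pair ([1-y, y], [1-p, p]), up to Keras's internal epsilon clipping:

import tensorflow as tf

p = 0.7   # predicted probability of class 1 (arbitrary example value)
y = 1.0   # true label

# Binary form: -(y*log(p) + (1-y)*log(1-p))
bce = tf.keras.losses.binary_crossentropy(tf.constant([y]), tf.constant([p]))

# Two-class categorical form: -sum_i y_i * log(p_i)
cce = tf.keras.losses.categorical_crossentropy(
    tf.constant([[1.0 - y, y]]),      # one-hot true label
    tf.constant([[1.0 - p, p]]))      # two-class predicted distribution

print(bce.numpy(), cce.numpy())       # both ≈ -log(0.7) ≈ 0.357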
Keras binary_crossentropy first converts your predicted probabilities to logits, then uses tf.nn.sigmoid_cross_entropy_with_logits to calculate the cross-entropy, and returns the mean of that. Mathematically speaking, if your label is 1 and your predicted probability is low (like 0.1), the cross-entropy is -log(0.1) ≈ 2.30, which is greater than 1, as in losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1])).
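To make that concrete, here is a small sketch reproducing the example just mentioned:

import tensorflow as tf
from tensorflow.keras import losses

# Label 1 with a predicted probability of 0.1: the loss is -log(0.1) ≈ 2.30.
loss = losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1]))
print(loss.numpy())                              # ≈ 2.30, clearly greater than 1

# The same value computed directly from the formula:
print(-tf.math.log(tf.constant(0.1)).numpy())    # ≈ 2.3026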
Yes, it is correct: cross-entropy is not bounded to any specific range; it is simply non-negative (≥ 0).