
Tensorflow: Interpretation of Weight in Weighted Cross Entropy

Tags:

tensorflow

The Tensorflow function tf.nn.weighted_cross_entropy_with_logits() takes the argument pos_weight. The documentation defines pos_weight as "A coefficient to use on the positive examples." I assume this means that increasing pos_weight increases the loss from false positives and decreases the loss from false negatives. Or do I have that backwards?

Ron Cohen asked Nov 19 '16 22:11


1 Answer

Actually, it's the other way around. Citing documentation:

The argument pos_weight is used as a multiplier for the positive targets.

So, assuming you have 5 positive examples in your dataset and 7 negative ones, if you set pos_weight=2, the loss is computed as if you had 10 positive examples and 7 negative.

Assume you got all of the positive examples wrong and all of the negative ones right. Originally you would have 5 false negatives and 0 false positives. When you increase pos_weight, the loss contribution from those false negatives increases, as if there were artificially more of them. Note that the loss value coming from false positives doesn't change.
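You can see this directly from the documented formula. A minimal NumPy sketch of that formula (names like weighted_ce are mine, not from TensorFlow) shows that pos_weight scales only the positive-label (false-negative) term:

```python
import numpy as np

def weighted_ce(labels, logits, pos_weight):
    # Documented formula of tf.nn.weighted_cross_entropy_with_logits:
    #   loss = pos_weight * z * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
    # where z are the labels and x the logits.
    sig = 1.0 / (1.0 + np.exp(-logits))
    return -(pos_weight * labels * np.log(sig) + (1.0 - labels) * np.log(1.0 - sig))

# A positive example predicted badly (a false negative) and a
# negative example predicted badly (a false positive):
fn = weighted_ce(np.array([1.0]), np.array([-2.0]), pos_weight=2.0)
fp = weighted_ce(np.array([0.0]), np.array([2.0]), pos_weight=2.0)

# Doubling pos_weight doubles the false-negative loss term...
print(fn / weighted_ce(np.array([1.0]), np.array([-2.0]), pos_weight=1.0))
# ...but leaves the false-positive loss term untouched.
print(fp / weighted_ce(np.array([0.0]), np.array([2.0]), pos_weight=1.0))
```

So setting pos_weight > 1 penalizes false negatives more heavily, which is why it is typically raised when the positive class is rare or when missing positives is costly.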

sygi answered Nov 02 '22 10:11