 

Loss function design to incorporate different weights for false positives and false negatives

I am trying to solve a semantic segmentation problem. Due to real-world constraints, the costs of false positives and false negatives differ: for instance, misclassifying a pixel as foreground is less desirable than misclassifying it as background. How can I handle this kind of constraint when setting up the loss function?

asked Feb 12 '17 by user297850



1 Answer

You can use the class_weight parameter of model.fit to weight your classes and, as such, punish misclassifications differently depending on the class.

class_weight: optional dictionary mapping class indices (integers) to a weight (float) to apply to the model's loss for the samples from this class during training. This can be useful to tell the model to "pay more attention" to samples from an under-represented class.

For example:

out = Dense(2, activation='softmax')(...)  # applied to the previous layer's output
model = Model(inputs=..., outputs=out)     # `inputs=`/`outputs=` in current Keras
model.fit(X, Y, class_weight={0: 1.0, 1: 0.5})

This would punish the second class less than the first.
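Note that `class_weight` assigns one weight per class; if you need finer control over the false-positive vs. false-negative trade-off, the same idea can be written directly into the loss. Below is a minimal NumPy sketch of an asymmetrically weighted binary cross-entropy; the weight names `w_fp`/`w_fn` and their values are illustrative assumptions, not part of the original answer:

```python
import numpy as np

def weighted_bce(y_true, y_pred, w_fp=2.0, w_fn=1.0, eps=1e-7):
    """Binary cross-entropy with separate weights for the two error types.

    w_fp scales the loss on true-background pixels (penalizes spurious
    foreground, i.e. false positives); w_fn scales the loss on
    true-foreground pixels (penalizes missed foreground, false negatives).
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    loss = -(w_fn * y_true * np.log(y_pred)
             + w_fp * (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()

# Two errors of equal confidence: one missed foreground pixel,
# one background pixel wrongly predicted as foreground.
miss  = weighted_bce(np.array([1.0]), np.array([0.3]))  # false negative
alarm = weighted_bce(np.array([0.0]), np.array([0.7]))  # false positive
# With w_fp=2.0 > w_fn=1.0, the false alarm costs twice as much.
```

The same weighting can be ported into a Keras custom loss by replacing the NumPy ops with their backend equivalents, which lets the penalty apply per pixel rather than per sample.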

answered Sep 28 '22 by nemo