I am trying to solve a semantic segmentation problem. Given the real-world constraints, false positives and false negatives carry different costs: for instance, a pixel misclassified as foreground is less acceptable than a pixel misclassified as background. How can I encode this asymmetry when setting up the loss function?
A false positive is an outcome where the model incorrectly predicts the positive class, and a false negative is an outcome where the model incorrectly predicts the negative class.
The true positive rate (TPR, also called sensitivity) is calculated as TP/(TP+FN); it is the probability that an actual positive will test positive. The true negative rate (TNR, also called specificity) is the probability that an actual negative will test negative; it is calculated as TN/(TN+FP).
A false positive is when a scientist determines something is true when it is actually false (also called a type I error). A false positive is a “false alarm.” A false negative is saying something is false when it is actually true (also called a type II error).
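To make the two rates concrete, here is a minimal sketch that computes sensitivity and specificity from raw confusion-matrix counts; the counts themselves are made-up illustration values, not from the question:

```python
def rates(tp, fn, tn, fp):
    """Return (TPR, TNR) from confusion-matrix counts."""
    tpr = tp / (tp + fn)  # sensitivity: P(test positive | actual positive)
    tnr = tn / (tn + fp)  # specificity: P(test negative | actual negative)
    return tpr, tnr

tpr, tnr = rates(tp=80, fn=20, tn=90, fp=10)
print(tpr, tnr)  # 0.8 0.9
```

Note that lowering false positives (raising TNR) and lowering false negatives (raising TPR) usually trade off against each other, which is why the weighting below matters.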
You can use the class_weight parameter of model.fit to weight your classes and thereby penalize misclassifications differently depending on the class.
class_weight: optional dictionary mapping class indices (integers) to a weight (float) to apply to the model's loss for the samples from this class during training. This can be useful to tell the model to "pay more attention" to samples from an under-represented class.
For example:
out = Dense(2, activation='softmax')(x)   # x: output tensor of the preceding layer
model = Model(inputs=inp, outputs=out)    # inp: the model's Input tensor
model.fit(X, Y, class_weight={0: 1.0, 1: 0.5})
This would punish the second class less than the first.
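If class_weight is not flexible enough (for per-pixel segmentation you may want direct control over the error type, not just the class), you can build the asymmetry into a custom loss. Below is a minimal NumPy sketch of an asymmetric binary cross-entropy in which the weights w_fp and w_fn are hypothetical knobs you would tune; the same formula can be ported to a Keras custom loss using backend ops:

```python
import numpy as np

def weighted_bce(y_true, y_pred, w_fp=2.0, w_fn=1.0, eps=1e-7):
    """Pixel-wise binary cross-entropy with separate weights for the
    two error directions. w_fp scales the penalty on background pixels
    (errors there are false-positive foreground predictions); w_fn
    scales the penalty on foreground pixels (errors there are false
    negatives). Both weights are illustrative, not tuned values."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    fn_term = -w_fn * y_true * np.log(y_pred)          # active on foreground pixels
    fp_term = -w_fp * (1 - y_true) * np.log(1 - y_pred)  # active on background pixels
    return float(np.mean(fn_term + fp_term))

# Flattened toy masks: 1 = foreground, 0 = background.
y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.2, 0.4, 0.1])
# Raising w_fp increases the loss whenever background pixels receive
# nonzero foreground probability, matching the question's constraint.
print(weighted_bce(y_true, y_pred, w_fp=2.0) > weighted_bce(y_true, y_pred, w_fp=1.0))  # True
```

Setting w_fp > w_fn tells the optimizer that predicting foreground on a background pixel costs more than the reverse, which is exactly the asymmetry described in the question.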