There is an evaluation metric in sklearn called f1-score (an f-beta score also exists).
I know how to use it, but I can't quite understand what it stands for.
What does it indicate when it is big or small?
Putting the formula aside, what should I understand from an f-score value?
F-score is a simple formula that combines the scores of precision and recall. Imagine you want to predict labels for a binary classification task (positive or negative). There are 4 types of predictions: true positives, false positives, true negatives and false negatives.
Precision is the proportion of true positives among all positive predictions. A precision of 1 means that you have no false positives, which is good because you never say that an element is positive when it is not.
Recall is the proportion of true positives among all actual positive elements. A recall of 1 means that you have no false negatives, which is good because you never say that an element belongs to the opposite class when it actually belongs to your class.
If you want to know whether your predictions are good, you need both measures. You can have a precision of 1 (so when you say an element is positive, it actually is) but still have a very low recall (you correctly predicted 3 positives but missed 15 others). Or you can have a good recall and a bad precision.
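A minimal sketch of that situation with sklearn.metrics, using made-up labels just for illustration (18 actual positives, of which the model finds only 3):

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Toy data: 18 actual positives and 10 actual negatives.
# The model flags only 3 positives, but never fires on a true negative.
y_true = [1] * 18 + [0] * 10
y_pred = [1] * 3 + [0] * 15 + [0] * 10

print(precision_score(y_true, y_pred))  # 1.0   -> every predicted positive is correct
print(recall_score(y_true, y_pred))     # ~0.17 -> only 3 of the 18 positives were found
print(f1_score(y_true, y_pred))         # ~0.29 -> dragged down by the poor recall
```

Notice how the f1-score stays low despite the perfect precision: it only gets close to 1 when both precision and recall are high.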
This is why you might check the f1-score, or any other type of f-score. If either of these two values drops dramatically, the f-score drops as well. But be aware that in many problems we prefer to give more weight to precision or to recall (in web security, it is better to wrongly block some good requests than to let some bad ones through).
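That weighting is what the f-beta score is for. A small sketch reusing the same toy labels as above (beta below 1 favours precision, beta above 1 favours recall, beta = 1 is plain f1):

```python
from sklearn.metrics import fbeta_score

# Same toy data as above: high precision, low recall.
y_true = [1] * 18 + [0] * 10
y_pred = [1] * 3 + [0] * 15 + [0] * 10

print(fbeta_score(y_true, y_pred, beta=0.5))  # ~0.50 -> leans toward the good precision
print(fbeta_score(y_true, y_pred, beta=2))    # ~0.20 -> leans toward the poor recall
```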