
What is the mAP metric and how is it calculated? [closed]

In computer vision and object detection, a common evaluation metric is mAP. What is it and how is it calculated?

asked Mar 29 '16 by cerebrou


People also ask

What is the mAP metric?

Mean average precision (mAP), sometimes referred to simply as AP, is a popular metric used to measure the performance of models on document/information retrieval and object detection tasks.

How is mAP detection calculated?

The mAP is calculated by finding the Average Precision (AP) for each class and then averaging over all classes. The mAP incorporates the trade-off between precision and recall and considers both false positives (FP) and false negatives (FN). This property makes mAP a suitable metric for most detection applications.

How is mAP calculated machine learning?

Using different confidence thresholds, a precision-recall curve is created, and from that curve the average precision (AP) is measured. For an object detection model, detections are scored against the ground truth by their intersection over union (IoU), with a threshold on the IoU deciding which detections count as correct. Once the AP is measured for each class in the dataset, the mAP is calculated as their mean.

What is MAP in machine learning?

Note that MAP can also stand for maximum a posteriori estimation, a different concept that shares the acronym: it involves calculating a conditional probability of observing the data given a model, weighted by a prior probability or belief about the model. MAP in this sense provides an alternative probability framework to maximum likelihood estimation.


2 Answers

mAP is Mean Average Precision.

Its definition differs between Information Retrieval (see references [1], [2]) and multi-class classification (object detection) settings.

To calculate it for object detection, you compute the average precision for each class in your data based on your model's predictions. Average precision is related to the area under the precision-recall curve for a class. Taking the mean of these per-class average precisions then gives you the Mean Average Precision.

To calculate Average Precision, see [3].
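As a toy illustration of that last aggregation step only (the function name and the per-class AP numbers below are made up, not from any reference implementation):

```python
def mean_average_precision(ap_per_class):
    # ap_per_class: dict mapping class name -> average precision (AP),
    # where each AP comes from that class's precision-recall curve.
    return sum(ap_per_class.values()) / len(ap_per_class)

# e.g. mean_average_precision({"cat": 0.71, "dog": 0.65, "car": 0.80}) -> 0.72
```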

answered Oct 20 '22 by Ankitp


Quotes are from the above-mentioned Zisserman paper, Section 4.2, Evaluation of Results (page 11):

First, an "overlap criterion" is defined as an intersection-over-union greater than 0.5 (i.e. a predicted box that satisfies this criterion with respect to a ground-truth box is considered a detection). Then a matching is made between the ground-truth boxes and the predicted boxes using this "greedy" approach:

Detections output by a method were assigned to ground truth objects satisfying the overlap criterion in order ranked by the (decreasing) confidence output. Multiple detections of the same object in an image were considered false detections e.g. 5 detections of a single object counted as 1 correct detection and 4 false detections

Hence each predicted box is either a true positive or a false positive, and each ground-truth box is either detected or missed (a false negative). There are no true negatives.
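A minimal sketch of that greedy matching, assuming plain (x1, y1, x2, y2) boxes; the helper names are mine, not code from the paper:

```python
def iou(box_a, box_b):
    # Intersection-over-union of two (x1, y1, x2, y2) boxes.
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def match_detections(detections, gt_boxes, iou_thresh=0.5):
    # detections: list of (confidence, box); gt_boxes: list of boxes.
    # Walk detections by decreasing confidence; each GT box may be
    # claimed once, so duplicate detections become false positives.
    matched, flags = set(), []
    for conf, box in sorted(detections, key=lambda d: -d[0]):
        best_iou, best_gt = 0.0, None
        for i, gt in enumerate(gt_boxes):
            if i not in matched:
                overlap = iou(box, gt)
                if overlap > best_iou:
                    best_iou, best_gt = overlap, i
        if best_iou > iou_thresh:
            matched.add(best_gt)
            flags.append(True)    # true positive
        else:
            flags.append(False)   # false positive
    return flags  # one flag per detection, in confidence-ranked order
```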

Then the average precision is computed by averaging the precision values on the precision-recall curve at the recall levels [0, 0.1, ..., 1] (i.e. the average of 11 precision values). To be more precise, we consider a slightly corrected PR curve: for each curve point (p, r), if there is a different curve point (p', r') such that p' > p and r' >= r, we replace p with the maximum such p'.
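Concretely, the PR curve itself is traced by sweeping down the ranked detections. A minimal sketch, assuming TP/FP flags like those produced by the matching step above:

```python
import numpy as np

def precision_recall_points(tp_flags, num_gt):
    # tp_flags: one boolean per detection, sorted by decreasing
    # confidence (True = true positive); num_gt: ground-truth count.
    flags = np.asarray(tp_flags, dtype=bool)
    tp = np.cumsum(flags)      # true positives among the top-k detections
    fp = np.cumsum(~flags)     # false positives among the top-k detections
    recalls = tp / num_gt
    precisions = tp / (tp + fp)
    return recalls, precisions
```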

What is still unclear to me is what is done with those GT boxes that are never detected (even if the confidence is 0). This means that there are certain recall values that the precision-recall curve will never reach, and this makes the average precision computation above undefined.

Edit:

Short answer: in the region where the recall is unreachable, the precision drops to 0.

One way to explain this is to imagine that as the confidence threshold approaches 0, an infinite number of predicted bounding boxes light up all over the image. The precision then immediately drops to 0 (since there is only a finite number of GT boxes), and the recall keeps growing along this flat part of the curve until it reaches 100%.
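Putting the pieces together, a sketch of the 11-point average described above, with that unreachable-recall convention baked in (the function name is mine, not reference code):

```python
import numpy as np

def eleven_point_ap(recalls, precisions):
    # Average the corrected precision at recall levels 0, 0.1, ..., 1.0:
    # for each level r, take the best precision achieved at any
    # recall >= r; if no point reaches r, the precision is taken as 0.
    recalls = np.asarray(recalls)
    precisions = np.asarray(precisions)
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):
        reachable = recalls >= r
        ap += precisions[reachable].max() if reachable.any() else 0.0
    return ap / 11.0
```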

answered Oct 20 '22 by Jonathan