I need to calculate the mAP described in this question for object detection, using TensorFlow.
Average precision (AP) is a typical performance measure used for ranked sets. Average precision is defined as the average of the precision scores after each true positive (TP) in the scope S. Given a scope S = 7 and a ranked list (gain vector) G = [1,1,0,1,1,0,0,1,1,0,1,0,0,...] where 1/0 indicate the gains associated with relevant/non-relevant items, respectively:
AP = (1/1 + 2/2 + 3/4 + 4/5) / 4 = 0.8875.
Mean Average Precision (mAP): the average of the average precision values for a set of queries.
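For example, for two queries with AP values 0.8875 and 0.5, mAP = (0.8875 + 0.5) / 2 = 0.69375.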
I have 5 one-hot tensors with the predictions:
prediction_A
prediction_B
prediction_C
prediction_D
prediction_E
where a single prediction tensor has this structure (for example prediction_A):
00100
01000
00001
00010
00010
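As a TensorFlow tensor, that would look like this (a sketch with the same values as above):

import tensorflow as tf

# one-hot predictions for 5 samples over 5 classes (same values as above)
prediction_A = tf.constant([[0, 0, 1, 0, 0],
                            [0, 1, 0, 0, 0],
                            [0, 0, 0, 0, 1],
                            [0, 0, 0, 1, 0],
                            [0, 0, 0, 1, 0]], dtype=tf.float32)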
Then I have the correct label (one-hot) tensors, with the same structure:
y_A
y_B
y_C
y_D
y_E
I want to compute mAP using TensorFlow, because I want to add it to my summaries. How can I do it?
I found this function, but I can't use it because I have a multidimensional vector.
I also wrote a Python function that computes AP, but it doesn't use TensorFlow:
def compute_av_precision(match_list):
    # match_list is a ranked list of booleans: True marks a true positive,
    # False a false positive, ordered by decreasing confidence
    n = len(match_list)
    tp_counter = 0
    cumulate_precision = 0.0
    for i in range(n):
        if match_list[i] == True:
            tp_counter += 1
            # precision after this true positive: TPs so far / items seen so far
            cumulate_precision += float(tp_counter) / float(i + 1)
    if tp_counter != 0:
        # average of the precision values taken at each true positive
        av_precision = cumulate_precision / float(tp_counter)
        return av_precision
    return 0
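For example, on the gain vector from the definition above, cut to scope S = 7, it gives the expected value:

match_list = [True, True, False, True, True, False, False]  # G cut to scope S = 7
print(compute_av_precision(match_list))  # 0.8875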
If you really want to use a TensorFlow function, there is tf.metrics.average_precision_at_k. For more info about average precision you can see this article.
Using different thresholds, a precision-recall curve is created. From that curve, the average precision (AP) is measured. For an object detection model, the threshold is the intersection over union (IoU) that scores the detected objects. Once the AP is measured for each class in the dataset, the mAP is calculated.
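A rough sketch of that per-class computation (an illustration, not any particular framework's implementation; the function and variable names here are made up): sort the detections of one class by confidence, mark each as TP or FP using the IoU threshold, build the precision-recall points, and integrate.

import numpy as np

def average_precision_for_class(tp_flags, num_ground_truth):
    # tp_flags: detections sorted by confidence; True if matched to a
    # ground-truth box at the chosen IoU threshold, False otherwise
    tp_flags = np.asarray(tp_flags, dtype=bool)
    tp = np.cumsum(tp_flags)
    fp = np.cumsum(~tp_flags)
    precision = tp / (tp + fp)
    recall = tp / float(num_ground_truth)
    # rectangle-rule integration of the precision-recall curve
    return np.sum(precision * np.diff(np.concatenate(([0.0], recall))))

# mAP for object detection = mean of the per-class AP values, e.g.
# np.mean([average_precision_for_class(f, n) for f, n in per_class_data])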
I think you may need this one:
tf.metrics.average_precision_at_k
This method takes labels and predictions to calculate the AP@K you mentioned. Below are the referenced links:
https://www.tensorflow.org/api_docs/python/tf/metrics/average_precision_at_k
which implements the AP@K metric defined here:
https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval)#Average_precision
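A minimal sketch of how it could be wired up with one-hot tensors like yours (the label and score values below are made up; labels have to be integer class indices, so tf.argmax converts a one-hot y tensor, and predictions should ideally be the model's per-class scores rather than its hard one-hot output):

import tensorflow as tf

# made-up one-hot ground truth for 5 samples over 5 classes (like your y_A)
y_A = tf.constant([[0, 0, 1, 0, 0],
                   [1, 0, 0, 0, 0],
                   [0, 0, 0, 0, 1],
                   [0, 1, 0, 0, 0],
                   [0, 0, 0, 1, 0]], dtype=tf.float32)

# per-class scores; ideally the model's soft scores, so each row has a ranking
scores_A = tf.constant([[0.1, 0.1, 0.6, 0.1, 0.1],
                        [0.1, 0.6, 0.1, 0.1, 0.1],
                        [0.1, 0.1, 0.1, 0.1, 0.6],
                        [0.1, 0.1, 0.1, 0.6, 0.1],
                        [0.1, 0.1, 0.1, 0.6, 0.1]], dtype=tf.float32)

# average_precision_at_k expects integer class labels, not one-hot vectors
labels = tf.argmax(y_A, axis=1)

# returns (metric, update_op); the metric value accumulates over update_op runs
ap_metric, ap_update = tf.metrics.average_precision_at_k(
    labels=labels, predictions=scores_A, k=5)

with tf.Session() as sess:
    sess.run(tf.local_variables_initializer())  # metric state lives in local variables
    sess.run(ap_update)                          # accumulate this batch
    print(sess.run(ap_metric))                   # mean AP@5 over the batch

Since ap_metric is a scalar tensor, you can also pass it to tf.summary.scalar if you want it in your summaries.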
BTW, if you need a metric in TensorFlow, you should first search the official documentation. Here is a list of all implemented metrics:
https://www.tensorflow.org/api_docs/python/tf/metrics
cheers