**1. Introduction**

mAP (mean Average Precision) is a popular evaluation metric for object detection (localisation and classification).

Object detection models such as SSD and YOLO predict a bounding box and a class label for each detected object.

**2. True/False Positive for bounding box**

For the bounding box, we measure the overlap between the predicted bounding box and the ground-truth bounding box with IoU (intersection over union).
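A minimal sketch of the IoU computation, assuming boxes are given as `(x1, y1, x2, y2)` corner coordinates (the coordinate convention is an assumption, not something the original specifies):

```python
def iou(box_a, box_b):
    """IoU of two boxes given as (x1, y1, x2, y2) corner coordinates."""
    # Intersection rectangle: overlap of the two boxes (may be empty).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    # Union = sum of the two areas minus the double-counted intersection.
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For example, two 2×2 boxes offset by (1, 1) overlap in a 1×1 square, giving IoU = 1 / (4 + 4 − 1) = 1/7.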

If the IoU value of a prediction is greater than the IoU threshold, we classify the prediction as a True Positive (TP). On the other hand, if the IoU value is below the threshold, we classify it as a False Positive (FP). Whether a prediction counts as a True or False Positive therefore depends on the IoU threshold: in the example above, if we had used an IoU threshold of 0.2, the FP would become a TP. In object detection:

**True Positive: if True Positive for both classification and bounding box**

**False Positive: if otherwise**
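The rule above can be sketched as a small helper; it takes the label-match result and a precomputed IoU value (the function name and parameters are illustrative, not from the original):

```python
def classify_prediction(label_correct, iou_value, iou_threshold=0.5):
    """TP only when BOTH the class label matches and the box overlap
    clears the IoU threshold; everything else is an FP."""
    return "TP" if label_correct and iou_value >= iou_threshold else "FP"
```

This also reproduces the threshold dependence described above: a correctly labelled box with IoU 0.3 is an FP at threshold 0.5 but a TP at threshold 0.2.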

**3. Calculate mAP**

**3.1 mAP**

For the picture above, the IoU threshold is 0.5.

*- Average Precision (AP) is the area under the precision-recall curve.*

*- mAP (mean average precision) is the average of AP.*

*- AP is calculated for each class and averaged to get the mAP.*

The mean Average Precision (mAP) score is calculated by taking the mean AP over all classes and/or over all IoU thresholds, depending on the detection challenge.
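A minimal sketch of the "mean over classes" step, assuming the per-class AP values have already been computed (the dictionary of example values is hypothetical):

```python
def mean_average_precision(ap_per_class):
    """mAP: the unweighted mean of the per-class AP values."""
    return sum(ap_per_class.values()) / len(ap_per_class)

# Hypothetical per-class AP values for illustration only.
ap_per_class = {"apple": 0.8, "orange": 0.6}
map_score = mean_average_precision(ap_per_class)  # (0.8 + 0.6) / 2 = 0.7
```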

**3.2 Interpolated precision**

The interpolated precision, p_interp, at a recall level *r* is defined as the maximum precision measured at any recall greater than or equal to *r*.

Consider a model making predictions on a dataset that contains 5 apples. We collect all the predictions made for apples across all the images and rank them in descending order of predicted confidence.

The precision at rank 4 = (1 + 1) / (1 + 1 + 1 + 1) = 0.5. The recall at rank 4 = (1 + 1) / 5 = 0.4.
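The rank-by-rank computation can be sketched as follows; `hits` is a list of 1 (TP) / 0 (FP) outcomes ranked by confidence, and the particular sequence below is assumed so that rank 4 reproduces the numbers above:

```python
def precision_recall_at_rank(hits, rank, total_gt):
    """Precision and recall using only the top-`rank` predictions.

    hits: list of 1 (TP) / 0 (FP), ranked by descending confidence.
    total_gt: total number of ground-truth objects (5 apples here).
    """
    tp = sum(hits[:rank])
    return tp / rank, tp / total_gt

# Assumed ranked outcomes: TP, FP, TP, FP, TP
hits = [1, 0, 1, 0, 1]
precision, recall = precision_recall_at_rank(hits, 4, total_gt=5)
# precision = 2/4 = 0.5, recall = 2/5 = 0.4
```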

We smooth out the zigzag pattern: the orange line is transformed into the green lines. We replace the precision value at recall *ȓ* with the maximum precision at any recall ≥ *ȓ*. Here, all precision values in the recall range [0.4, 0.8] are replaced by the maximum precision at recall 0.8.

Pascal VOC2008 used the 11-point interpolated AP: we divide the recall range from 0 to 1.0 into 11 points (0, 0.1, 0.2, …, 0.9, 1.0) and average the interpolated precision at these points.

AP = (1/11) × (1 + 1 + 1 + 1 + 1 + 0.57 + 0.57 + 0.57 + 0.57 + 0.5 + 0.5) ≈ 0.753
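The 11-point average can be sketched directly; the step function below encodes the interpolated precision values from the example (1.0 up to recall 0.4, 0.57 up to 0.8, 0.5 beyond — assumed from the sum above):

```python
def eleven_point_ap(p_interp):
    """11-point interpolated AP: average p_interp at recalls 0, 0.1, ..., 1.0."""
    recalls = [i / 10 for i in range(11)]
    return sum(p_interp(r) for r in recalls) / 11

def p_interp(r):
    # Step function matching the example's interpolated curve (assumed values).
    if r <= 0.4:
        return 1.0
    if r <= 0.8:
        return 0.57
    return 0.5

ap = eleven_point_ap(p_interp)  # 8.28 / 11, approximately 0.753
```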

COCO uses a 101-point interpolated AP (recall levels 0, 0.01, …, 1.0).

AP75 (AP@.75) means the AP with IoU threshold = 0.75.

AP50 (AP@.50) means the AP with IoU threshold = 0.50.
