Evaluation Metrics

1 mIoU (Mean Intersection over Union)

Mean Intersection over Union (mIoU) measures the average overlap between predicted pixels and ground-truth pixels across all classes. For each class, IoU is computed by dividing the area of intersection between the predicted segmentation and the ground truth by the area of their union; mIoU is the mean of these per-class IoU values.
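A minimal pure-Python sketch of this computation (the flattened label lists, `num_classes`, and function names here are illustrative, not a specific library API):

```python
def compute_iou_per_class(pred, gt, num_classes):
    """Per-class IoU: |pred ∩ gt| / |pred ∪ gt| for each class label."""
    ious = {}
    for c in range(num_classes):
        inter = sum(p == c and g == c for p, g in zip(pred, gt))
        union = sum(p == c or g == c for p, g in zip(pred, gt))
        if union:  # skip classes absent from both prediction and ground truth
            ious[c] = inter / union
    return ious

def compute_miou(pred, gt, num_classes):
    """mIoU: unweighted mean of the per-class IoU values."""
    ious = compute_iou_per_class(pred, gt, num_classes)
    return sum(ious.values()) / len(ious)

# Hypothetical flattened label maps (one class index per pixel).
pred = [0, 0, 1, 1, 1, 2, 2, 2, 2]
gt   = [0, 0, 1, 1, 2, 2, 2, 2, 2]
print(compute_miou(pred, gt, num_classes=3))  # per-class IoUs 1.0, 2/3, 0.8
```

Because mIoU averages over classes rather than pixels, a small class that is segmented poorly pulls the score down just as much as a large one.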

2 aAcc (Average Accuracy)

Average accuracy (aAcc) refers to the global average accuracy obtained by dividing the number of correctly predicted pixels by the total number of pixels. It reflects the average accuracy of the model over the entire image, regardless of the differences in pixel quantities among different classes.
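This is a single global ratio, which a short sketch makes concrete (the label lists and function name are illustrative):

```python
def compute_aacc(pred, gt):
    """aAcc: correctly predicted pixels / total pixels, ignoring class balance."""
    correct = sum(p == g for p, g in zip(pred, gt))
    return correct / len(gt)

# Hypothetical flattened label maps; 8 of 9 pixels match.
pred = [0, 0, 1, 1, 1, 2, 2, 2, 2]
gt   = [0, 0, 1, 1, 2, 2, 2, 2, 2]
print(compute_aacc(pred, gt))  # 8/9
```

Because every pixel counts equally, aAcc can look high even when small classes are predicted badly, as long as the dominant classes are correct.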

3 mAcc (Mean Accuracy)

Mean accuracy (mAcc) is the average of per-class pixel accuracy. For each class, the number of correctly predicted pixels is divided by the total number of ground-truth pixels in that class; the per-class accuracies are then averaged. Unlike aAcc, mAcc accounts for the variation in pixel counts among classes and gives each class equal weight in the final score.
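The per-class averaging can be sketched as follows (the label lists, `num_classes`, and function name are illustrative):

```python
def compute_macc(pred, gt, num_classes):
    """mAcc: mean over classes of (correct pixels of class c) / (gt pixels of class c)."""
    accs = []
    for c in range(num_classes):
        total = sum(g == c for g in gt)
        if total == 0:
            continue  # skip classes with no ground-truth pixels
        correct = sum(p == c and g == c for p, g in zip(pred, gt))
        accs.append(correct / total)
    return sum(accs) / len(accs)

# Hypothetical flattened label maps (same toy data as above).
pred = [0, 0, 1, 1, 1, 2, 2, 2, 2]
gt   = [0, 0, 1, 1, 2, 2, 2, 2, 2]
print(compute_macc(pred, gt, num_classes=3))  # per-class accuracies 1.0, 1.0, 0.8
```

On this toy example mAcc differs from aAcc (8/9) because the single error falls entirely on class 2, and mAcc weights that class's recall equally with the others rather than by pixel count.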