Performance Indices for Binary Classification
張智星 (Roger Jang)
Multimedia Information Retrieval Lab
Dept. of Computer Science and Information Engineering, National Taiwan University
Confusion Matrix for Binary Classification
- Terminology used in a confusion matrix (1: positive, 0: negative):

                         Predicted = 0               Predicted = 1
    Target = 0   TN (true negative)          FP (false positive)
                 correct rejection           false alarm, Type-1 error
    Target = 1   FN (false negative)         TP (true positive)
                 miss, Type-2 error          hit

    N = TN + FP (all actual negatives), P = FN + TP (all actual positives)
- Commonly used formulas:
  - TPR (hit rate, recall) = TP/P
  - FPR (false-alarm rate) = FP/N
  - FNR (miss rate) = FN/P = 1 - TPR
  - TNR (specificity) = TN/N = 1 - FPR
  - Accuracy = (TP + TN)/(P + N)
  - Precision = TP/(TP + FP)
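A minimal MATLAB sketch of these counts and rates (illustrative only, not MLT code; "target" and "predicted" are assumed to be 0/1 label vectors):

    % Confusion-matrix counts from 0/1 label vectors
    TP = sum(target==1 & predicted==1);   % hits
    TN = sum(target==0 & predicted==0);   % correct rejections
    FP = sum(target==0 & predicted==1);   % false alarms (Type-1 errors)
    FN = sum(target==1 & predicted==0);   % misses (Type-2 errors)
    P = TP + FN;                          % all actual positives
    N = TN + FP;                          % all actual negatives
    TPR = TP/P;  FPR = FP/N;              % hit rate, false-alarm rate
    FNR = FN/P;  TNR = TN/N;              % miss rate, specificity
    accuracy = (TP + TN)/(P + N);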
ROC Curve and AUC
- ROC: receiver operating characteristic
  - Plot of TPR vs FPR, parameterized by a threshold on the predicted output in [0, 1]
- AUC: area under the curve
  - The AUC of the ROC curve is a commonly used performance index for binary classification
    - AUC = 1: perfect classifier
    - AUC = 0.5: no better than random guessing
  - AUC is well defined only when the predicted output is continuous within [0, 1]
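A minimal MATLAB sketch of the ROC curve and its AUC (illustrative, not the MLT implementation; "score" is assumed to hold continuous predictions in [0, 1] with no ties, and "target" the 0/1 labels):

    % Sweep the threshold implicitly by sorting scores in descending order
    [~, idx] = sort(score, 'descend');
    sortedTarget = target(idx);
    P = sum(target==1);  N = sum(target==0);
    TPR = cumsum(sortedTarget==1)/P;          % hit rate after each step
    FPR = cumsum(sortedTarget==0)/N;          % false-alarm rate after each step
    AUC = trapz([0; FPR(:)], [0; TPR(:)]);    % trapezoidal area under the ROC
    plot(FPR, TPR); xlabel('FPR'); ylabel('TPR'); title('ROC curve');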
DET Curve
- DET: detection error tradeoff
  - Plot of FNR (miss) vs FPR (false alarm)
  - A flipped view of the ROC curve, since FNR = 1 - TPR
  - Preserves the same information as the ROC curve
  - Often easier to interpret
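Continuing the ROC sketch above, the DET curve follows directly from the same quantities. DET curves are conventionally drawn on a normal-deviate (probit) scale; the probit below is built from base-MATLAB erfcinv, and this is again a sketch rather than detPlot.m:

    FNR = 1 - TPR;                            % miss rate from the ROC's hit rate
    probit = @(p) -sqrt(2)*erfcinv(2*p);      % inverse standard-normal CDF
    ok = FPR>0 & FPR<1 & FNR>0 & FNR<1;       % drop endpoints that map to +/-Inf
    plot(probit(FPR(ok)), probit(FNR(ok)));
    xlabel('False alarm (FPR)'); ylabel('Miss (FNR)'); title('DET curve');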
Example of DET Curve
- detGet.m (in MLT)
Example of DET Curve (2)
- detPlot.m (in MLT)
About KWC