Model Evaluation & Metrics

Classification Metrics

Core Metrics

Confusion Matrix:

                Predicted
                Pos    Neg
Actual   Pos    TP     FN
         Neg    FP     TN
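
A minimal sketch reproducing this layout with scikit-learn (the labels below are made up); passing labels=[1, 0] puts Pos first so the matrix matches the table above:

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 1, 0, 0]  # actual labels (hypothetical)
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # model predictions (hypothetical)

# labels=[1, 0] orders rows/columns as [Pos, Neg]:
# rows = actual, columns = predicted, i.e. [[TP, FN], [FP, TN]].
cm = confusion_matrix(y_true, y_pred, labels=[1, 0])
tp, fn, fp, tn = cm.ravel()
print(cm)              # [[3 1]
                       #  [1 3]]
print(tp, fn, fp, tn)  # 3 1 1 3
```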

Precision = TP/(TP+FP) - of predicted positives, how many are correct?
Recall = TP/(TP+FN) - of actual positives, how many are found?
F1 = 2×(P×R)/(P+R) - the harmonic mean of precision and recall
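
A quick numeric check on the same toy data as above: the formulas computed by hand should agree with scikit-learn's precision_score, recall_score, and f1_score:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

tp, fp, fn = 3, 1, 1  # counts from the confusion matrix above

precision = tp / (tp + fp)                          # 0.75
recall = tp / (tp + fn)                             # 0.75
f1 = 2 * precision * recall / (precision + recall)  # 0.75

print(precision, precision_score(y_true, y_pred))   # 0.75 0.75
print(recall, recall_score(y_true, y_pred))         # 0.75 0.75
print(f1, f1_score(y_true, y_pred))                 # 0.75 0.75
```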

Interview Q: "Spam filter - precision or recall?" A: Precision (false positives cost user trust)

Interview Q: "Fraud detection - precision or recall?" A: Recall (missing fraud is worse than false alarms)

ROC-AUC: Binary classification metric. The ROC curve plots TPR vs FPR across all decision thresholds; AUC is the area under it. AUC=0.5 means random ranking, AUC=1.0 means perfect ranking.
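
A minimal sketch on the same hypothetical scores; note that roc_auc_score takes continuous scores, not hard 0/1 predictions:

```python
from sklearn.metrics import roc_auc_score

y_true = [1, 1, 1, 0, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.6, 0.55, 0.4, 0.35, 0.3, 0.1]

# AUC equals the probability a random positive outscores a random negative.
print(roc_auc_score(y_true, scores))  # 0.875
```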
