Performance Metrics for Binary Classification Problems Cheatsheet

This article is merely a quick recap of machine learning knowledge; it is not meant to serve as a tutorial.
All rights reserved by Diane (Qingyun Hu).

Prerequisites

TP: True Positive (predicted positive, actually positive)
FP: False Positive (predicted positive, actually negative)
TN: True Negative (predicted negative, actually negative)
FN: False Negative (predicted negative, actually positive)
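
These four counts come from comparing predicted labels against true labels. Below is a minimal Python sketch of how they might be tallied; the vectors y_true and y_pred are made-up example data, with 1 as the positive class.

# Tally TP, FP, TN, FN from example label vectors (1 = positive, 0 = negative).
y_true = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0, 0, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # predicted positive, actually positive
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted positive, actually negative
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # predicted negative, actually negative
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # predicted negative, actually positive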

Recall

= Sensitivity = TPR (True Positive Rate)
\begin{equation}
Recall = \frac{TP} {TP + FN}
\end{equation}
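
A minimal numeric sketch in Python, assuming example counts tp = 40 and fn = 5:

tp, fn = 40, 5           # assumed example counts
recall = tp / (tp + fn)  # fraction of actual positives that were caught
print(recall)            # ≈ 0.889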

Precision

\begin{equation}
Precision = \frac{TP} {TP + FP}
\end{equation}
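
The same kind of sketch for precision, assuming example counts tp = 40 and fp = 10:

tp, fp = 40, 10             # assumed example counts
precision = tp / (tp + fp)  # fraction of positive predictions that are correct
print(precision)            # 0.8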

Accuracy

\begin{equation}
Accuracy = \frac{TP + TN} {TP + FP +TN + FN}
\end{equation}
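
Continuing with the same assumed example counts:

tp, fp, tn, fn = 40, 10, 45, 5              # assumed example counts
accuracy = (tp + tn) / (tp + fp + tn + fn)  # fraction of all predictions that are correct
print(accuracy)                             # 0.85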

F1 Score

\begin{equation}
F1\ Score = \frac{2 \times Recall \times Precision} {Recall + Precision}
\end{equation}
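
F1 is the harmonic mean of recall and precision. A sketch using the recall and precision values from the assumed counts above:

recall, precision = 40 / 45, 40 / 50                # ≈ 0.889 and 0.8, from the sketches above
f1 = 2 * recall * precision / (recall + precision)  # harmonic mean of the two
print(f1)                                           # ≈ 0.842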

Specificity

\begin{equation}
Specificity = \frac{TN} {TN + FP}
\end{equation}
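
A sketch with assumed example counts tn = 45 and fp = 10:

tn, fp = 45, 10               # assumed example counts
specificity = tn / (tn + fp)  # fraction of actual negatives correctly rejected
print(specificity)            # ≈ 0.818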

FPR (False Positive Rate)

= 1 - Specificity
\begin{equation}
FPR = \frac{FP} {TN + FP}
\end{equation}
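
With the same assumed counts, FPR comes out as the complement of specificity:

tn, fp = 45, 10       # assumed example counts
fpr = fp / (tn + fp)  # equals 1 - specificity
print(fpr)            # ≈ 0.182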

ROC Curve

x-axis: FPR ( = 1 - Specificity )
y-axis: TPR ( = Recall )
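
Each point on the curve corresponds to one classification threshold on the model's predicted scores. A minimal sketch using scikit-learn's roc_curve, with made-up labels and scores:

from sklearn.metrics import roc_curve

y_true  = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]                        # made-up example labels
y_score = [0.9, 0.4, 0.8, 0.3, 0.6, 0.1, 0.7, 0.2, 0.35, 0.55]  # made-up positive-class scores

# Sweeping a threshold over y_score yields one (FPR, TPR) point per threshold.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(fpr)
print(tpr)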

AUC (Area under the ROC Curve)

The larger the AUC, the better: an AUC of 0.5 corresponds to random guessing, while an AUC of 1.0 indicates a perfect classifier.
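
A minimal sketch using scikit-learn's roc_auc_score on the same made-up data as above:

from sklearn.metrics import roc_auc_score

y_true  = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]                        # made-up example labels
y_score = [0.9, 0.4, 0.8, 0.3, 0.6, 0.1, 0.7, 0.2, 0.35, 0.55]  # made-up positive-class scores

auc = roc_auc_score(y_true, y_score)  # area under the ROC curve
print(auc)                            # 0.92 for this example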
