
Confusion Matrix

Updated at 2018-07-06 22:41

A confusion matrix is a visualization of the performance of a classification algorithm.

                  Actual 0          Actual 1
Predicted 0       True Negative     False Negative
Predicted 1       False Positive    True Positive
Precision = TP / (TP+FP)
Recall    = TP / (TP+FN)
Accuracy  = (TP+TN) / (TP+TN+FP+FN)
F1 Score  = (2 * Precision * Recall) / (Precision + Recall)

Precision describes how confident we can be in our positive predictions: out of everything we predicted positive, how much truly is positive.

Recall describes how large a portion of all actual positives our positive predictions manage to cover.
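
A minimal sketch of these formulas in Python, computed straight from the four raw counts (the function name is just illustrative):

def metrics(tp, tn, fp, fn):
    # Derive the four basic metrics from raw confusion matrix counts.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    f1 = (2 * precision * recall) / (precision + recall)
    return precision, recall, accuracy, f1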

For example:

A sample of 27 animals: 8 cats, 6 dogs, and 13 rabbits.
                  Actual Cat    Actual Dog    Actual Rabbit
Predicted Cat     5             2             0
Predicted Dog     3             3             2
Predicted Rabbit  0             1             11
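
A sketch of how such a matrix could be built in Python, following the convention above where rows are predictions and columns are actual classes; the label lists passed in are hypothetical model output:

labels = ["cat", "dog", "rabbit"]
index = {label: i for i, label in enumerate(labels)}

def confusion_matrix(actual, predicted):
    # matrix[p][a]: row is the predicted class, column is the actual class.
    matrix = [[0] * len(labels) for _ in labels]
    for a, p in zip(actual, predicted):
        matrix[index[p]][index[a]] += 1
    return matrix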

Table of confusion for the cat class:

                   Actual Cat     Actual Non-cat
Predicted Cat      5 TP           2 FP (2+0)
Predicted Non-cat  3 FN (3+0)     17 TN (3+2+1+11)

Table of confusion for the dog class:

                   Actual Dog     Actual Non-dog
Predicted Dog      3 TP           5 FP (3+2)
Predicted Non-dog  3 FN (2+1)     16 TN (5+0+0+11)

Table of confusion for the rabbit class:

                      Actual Rabbit    Actual Non-rabbit
Predicted Rabbit      11 TP            1 FP (1+0)
Predicted Non-rabbit  2 FN (2+0)       13 TN (5+2+3+3)
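
Each of these tables follows from the multi-class matrix with the same one-vs-rest arithmetic; a sketch, assuming the confusion_matrix layout above:

def table_of_confusion(matrix, i):
    # One-vs-rest counts for class index i (rows predicted, columns actual).
    total = sum(sum(row) for row in matrix)
    tp = matrix[i][i]
    fp = sum(matrix[i]) - tp                 # predicted i, actually another class
    fn = sum(row[i] for row in matrix) - tp  # actually i, predicted another class
    tn = total - tp - fp - fn
    return tp, fp, fn, tn

For example, table_of_confusion([[5, 2, 0], [3, 3, 2], [0, 1, 11]], 0) returns (5, 2, 3, 17), matching the cat table.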

The final table of confusion contains the average values over all classes (macro-averaging):

             Actual 1     Actual 0
Predicted 1  6.33 TP      2.66 FP
Predicted 0  2.66 FN      15.33 TN
Precision = 6.33 / (6.33 + 2.66)                           = 0.70
Recall    = 6.33 / (6.33 + 2.66)                           = 0.70
Accuracy  = (6.33 + 15.33) / (6.33 + 15.33 + 2.66 + 2.66)  = 0.80
F1 Score  = (2 * 0.70 * 0.70) / (0.70 + 0.70)              = 0.70
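
Continuing the sketches above, the same figures come from macro-averaging the per-class counts and running them through metrics once:

matrix = [[5, 2, 0], [3, 3, 2], [0, 1, 11]]
per_class = [table_of_confusion(matrix, i) for i in range(3)]
# Average each of the four counts (TP, FP, FN, TN) over the three classes.
avg_tp, avg_fp, avg_fn, avg_tn = (
    sum(counts[j] for counts in per_class) / len(per_class) for j in range(4)
)
print(metrics(avg_tp, avg_tn, avg_fp, avg_fn))
# -> approximately (0.70, 0.70, 0.80, 0.70)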
