*Confusion Matrix*

Updated at 2018-07-07 01:41

A confusion matrix is a table that visualizes the performance of a classification algorithm by comparing predicted labels against actual labels.

| | Actual 0 | Actual 1 |
|---|---|---|
| Predicted 0 | True Negative | False Negative |
| Predicted 1 | False Positive | True Positive |

```
Precision = TP / (TP+FP)
Recall = TP / (TP+FN)
Accuracy = (TP+TN) / (TP+TN+FP+FN)
F1 Score = (2 * Precision * Recall) / (Precision + Recall)
```
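The four formulas above can be sketched as small helper functions (the counts passed in are hypothetical, not taken from any particular example):

```python
def precision(tp, fp):
    # Of all positive predictions, the fraction that were actually positive.
    return tp / (tp + fp)

def recall(tp, fn):
    # Of all actual positives, the fraction the model correctly found.
    return tp / (tp + fn)

def accuracy(tp, tn, fp, fn):
    # Fraction of all predictions that were correct.
    return (tp + tn) / (tp + tn + fp + fn)

def f1_score(p, r):
    # Harmonic mean of precision and recall.
    return 2 * p * r / (p + r)
```

Note that none of these is defined when its denominator is zero (e.g. precision with no positive predictions), so real implementations usually special-case that.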

**Precision** describes how confident we can be in the positive predictions: of everything predicted positive, how much actually is positive.

**Recall** describes how large a portion of the actual positives our positive predictions cover.

For example, take a sample of 27 animals: 8 cats, 6 dogs, and 13 rabbits. A classifier produces the following predictions:

| | Actual Cat | Actual Dog | Actual Rabbit |
|---|---|---|---|
| Predicted Cat | 5 | 2 | 0 |
| Predicted Dog | 3 | 3 | 2 |
| Predicted Rabbit | 0 | 1 | 11 |
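A matrix like this is just a tally of (actual, predicted) pairs. As a sketch, the label pairs below are hypothetical, chosen only to reproduce the counts above:

```python
from collections import Counter

# Hypothetical (actual, predicted) pairs matching the 27-animal example.
pairs = (
    [("cat", "cat")] * 5 + [("dog", "cat")] * 2 +
    [("cat", "dog")] * 3 + [("dog", "dog")] * 3 + [("rabbit", "dog")] * 2 +
    [("dog", "rabbit")] * 1 + [("rabbit", "rabbit")] * 11
)
# Tally pairs into a confusion matrix: (actual, predicted) -> count.
matrix = Counter(pairs)
```

For example, `matrix[("dog", "cat")]` gives 2: the two dogs that were predicted to be cats.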

Table of confusion for cat class:

| | Actual Cat | Actual Non-cat |
|---|---|---|
| Predicted Cat | 5 TP | 2 FP (2+0) |
| Predicted Non-cat | 3 FN (3+0) | 17 TN (3+2+1+11) |

Table of confusion for dog class:

| | Actual Dog | Actual Non-dog |
|---|---|---|
| Predicted Dog | 3 TP | 5 FP (3+2) |
| Predicted Non-dog | 3 FN (2+1) | 16 TN (5+0+0+11) |

Table of confusion for rabbit class:

| | Actual Rabbit | Actual Non-rabbit |
|---|---|---|
| Predicted Rabbit | 11 TP | 1 FP (1+0) |
| Predicted Non-rabbit | 2 FN (2+0) | 13 TN (5+2+3+3) |
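Each per-class table above is a one-vs-rest split of the 3x3 matrix. A minimal sketch, with the matrix stored as a dict from (actual, predicted) pairs to counts:

```python
# The 3x3 matrix from the example, keyed by (actual, predicted).
matrix = {
    ("cat", "cat"): 5, ("dog", "cat"): 2, ("rabbit", "cat"): 0,
    ("cat", "dog"): 3, ("dog", "dog"): 3, ("rabbit", "dog"): 2,
    ("cat", "rabbit"): 0, ("dog", "rabbit"): 1, ("rabbit", "rabbit"): 11,
}
classes = ["cat", "dog", "rabbit"]

def one_vs_rest(target):
    # Collapse the multiclass matrix into a binary table for one class.
    tp = matrix[(target, target)]
    fp = sum(matrix[(a, target)] for a in classes if a != target)  # predicted target, wasn't
    fn = sum(matrix[(target, p)] for p in classes if p != target)  # was target, missed
    tn = sum(matrix[(a, p)] for a in classes for p in classes
             if a != target and p != target)                       # neither side is target
    return tp, fp, fn, tn
```

For instance, `one_vs_rest("cat")` reproduces the cat table: (5, 2, 3, 17).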

The final table of confusion contains the values macro-averaged over all three classes:

| | Actual 1 | Actual 0 |
|---|---|---|
| Predicted 1 | 6.33 TP | 2.67 FP |
| Predicted 0 | 2.67 FN | 15.33 TN |

```
Precision = 6.33 / (6.33 + 2.67) = 0.70
Recall = 6.33 / (6.33 + 2.67) = 0.70
Accuracy = (6.33 + 15.33) / (6.33 + 15.33 + 2.67 + 2.67) = 0.80
F1 Score = (2 * 0.70 * 0.70) / (0.70 + 0.70) = 0.70
```
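The macro-averaged arithmetic above can be checked directly from the per-class counts (tuples ordered TP, FP, FN, TN as in the tables):

```python
# Per-class (tp, fp, fn, tn) from the three tables of confusion.
per_class = {"cat": (5, 2, 3, 17), "dog": (3, 5, 3, 16), "rabbit": (11, 1, 2, 13)}
n = len(per_class)

# Macro-average each count across classes.
tp = sum(c[0] for c in per_class.values()) / n  # 19/3 ~ 6.33
fp = sum(c[1] for c in per_class.values()) / n  # 8/3 ~ 2.67
fn = sum(c[2] for c in per_class.values()) / n  # 8/3 ~ 2.67
tn = sum(c[3] for c in per_class.values()) / n  # 46/3 ~ 15.33

precision = tp / (tp + fp)
recall = tp / (tp + fn)
accuracy = (tp + tn) / (tp + tn + fp + fn)
f1 = 2 * precision * recall / (precision + recall)
```

Since the averaged FP and FN happen to be equal here, precision, recall, and F1 all come out to the same value, about 0.70.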