Table of confusion
In predictive analytics, a table of confusion is a table with two rows and two columns that reports the number of true negatives, false positives, false negatives, and true positives.
                  Predicted Negative    Predicted Positive
Negative Cases    True Negatives        False Positives
Positive Cases    False Negatives       True Positives
Table 1: Table of Confusion.
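
The four cells can be tallied directly from paired actual and predicted labels. The following is a minimal sketch in Python; the function name table_of_confusion and the label encoding are illustrative assumptions, not part of any standard library.

```python
# Minimal sketch: tally the four cells of a table of confusion from paired
# actual/predicted labels. Function name and label encoding are illustrative.
def table_of_confusion(actual, predicted, positive=1):
    """Return (true_negatives, false_positives, false_negatives, true_positives)."""
    tn = fp = fn = tp = 0
    for a, p in zip(actual, predicted):
        if a == positive and p == positive:
            tp += 1          # positive case predicted positive
        elif a == positive:
            fn += 1          # positive case predicted negative
        elif p == positive:
            fp += 1          # negative case predicted positive
        else:
            tn += 1          # negative case predicted negative
    return tn, fp, fn, tp
```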
For example, consider a model that predicts, for each of 10,000 insurance claims, whether the claim is fraudulent. The model correctly identifies 9,700 non-fraudulent claims and 100 fraudulent claims. It also incorrectly flags 150 non-fraudulent claims as fraudulent, and predicts 50 fraudulent claims to be non-fraudulent. The resulting table of confusion is shown below.
                  Predicted Negative    Predicted Positive
Negative Cases    9,700                 150
Positive Cases    50                    100
Table 2: Example Table of Confusion.
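
As a check, the counts in Table 2 can be reproduced with scikit-learn's confusion_matrix (assuming scikit-learn is available; the synthetic label arrays below are built purely for illustration). Its rows correspond to actual classes and its columns to predicted classes, with the negative class first, matching the layout of Table 2.

```python
# Sketch: rebuild Table 2 from synthetic labels matching the stated counts,
# using scikit-learn (assumed available). 0 = non-fraudulent, 1 = fraudulent.
import numpy as np
from sklearn.metrics import confusion_matrix

actual    = np.array([0] * 9850 + [1] * 150)    # 9,850 negative cases, 150 positive cases
predicted = np.array([0] * 9700 + [1] * 150     # negatives: 9,700 TN, 150 FP
                     + [0] * 50 + [1] * 100)    # positives: 50 FN, 100 TP

print(confusion_matrix(actual, predicted))
# [[9700  150]
#  [  50  100]]
```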