Confusion Matrix: TP, TN, FP, FN #37

@sophryu99

Description

Confusion Matrix

A confusion matrix describes the performance of a classification model. In other words, a confusion matrix is a way to summarize classifier performance. The following figure shows a basic representation of a confusion matrix:

[Figure: a 2×2 confusion matrix comparing the predicted class against the actual class]

It’s based on the fact that we need to compare the class predicted by the classifier with the actual class for each observation.
  • True Negative: The predicted class is negative and coincides with the actual class, which is also negative. In the example, the predicted class is 0 and the actual class of that observation is 0, so the prediction is correct.
  • True Positive: The predicted class is positive and coincides with the actual class. In the example, the predicted class is 1 and the actual class of that observation is 1, so the prediction is correct.
  • False Negative: The predicted class is negative but doesn’t coincide with the actual class, which is positive. In the example, the predicted class is 0 but the actual class of that observation is 1, so the prediction is wrong.
  • False Positive: The predicted class is positive, but the actual class is negative. In the example, the predicted class is 1 and the actual class of that observation is 0, so the prediction is again wrong.

If the predicted value and the actual value are the same, the prediction is True; otherwise it is False.
If the predicted value is 1, the prediction is Positive; otherwise it is Negative.

>>> from sklearn.metrics import confusion_matrix
>>> # confusion_matrix(y_true, y_pred); ravel() flattens the 2x2 matrix to (tn, fp, fn, tp)
>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(0, 2, 1, 1)
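For intuition, the same counts can be reproduced by applying the two rules above to each (actual, predicted) pair by hand. The sketch below is not part of the original issue; the helper name count_confusion is made up for illustration, and the result is checked against scikit-learn's output for the same labels.

from sklearn.metrics import confusion_matrix

def count_confusion(y_true, y_pred):
    """Count TN, FP, FN, TP by comparing each prediction with the actual label."""
    tn = fp = fn = tp = 0
    for actual, predicted in zip(y_true, y_pred):
        if predicted == 1:          # predicted Positive
            if actual == predicted:
                tp += 1             # True Positive: prediction matches the actual class
            else:
                fp += 1             # False Positive: predicted 1, actual 0
        else:                       # predicted Negative
            if actual == predicted:
                tn += 1             # True Negative: prediction matches the actual class
            else:
                fn += 1             # False Negative: predicted 0, actual 1
    return tn, fp, fn, tp

y_true = [0, 1, 0, 1]
y_pred = [1, 1, 1, 0]
print(count_confusion(y_true, y_pred))              # (0, 2, 1, 1)
print(confusion_matrix(y_true, y_pred).ravel())     # [0 2 1 1] -> tn, fp, fn, tp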
