confusion-matrix

Last updated Dec 1, 2022

# Summary

A good model has high TP and TN counts and low FP (Type I error, $\alpha$) and FN (Type II error, $\beta$) counts; equivalently, it has a high F1 score.

Accuracy: $\text{Accuracy} = \dfrac{TP + TN}{TP + TN + FP + FN}$

Precision: $\text{Precision} = \dfrac{TP}{TP + FP}$

Recall / sensitivity: $\text{Recall} = \dfrac{TP}{TP + FN}$

Specificity: $\text{Specificity} = \dfrac{TN}{TN + FP}$

F1 Score: $F_1 = 2 \cdot \dfrac{\text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}$
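
A minimal sketch of these formulas in Python, using made-up counts for a hypothetical binary classifier:

```python
# Hypothetical counts from a binary confusion matrix (the numbers are made up).
tp, tn, fp, fn = 90, 50, 10, 5

accuracy    = (tp + tn) / (tp + tn + fp + fn)
precision   = tp / (tp + fp)
recall      = tp / (tp + fn)          # also called sensitivity
specificity = tn / (tn + fp)
f1          = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.3f} precision={precision:.3f} "
      f"recall={recall:.3f} specificity={specificity:.3f} f1={f1:.3f}")
```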

# What is confusion matrix?

A confusion matrix is a table layout that allows visualization of the performance of an algorithm. It is an $N \times N$ matrix used for evaluating the performance of a classification model, where $N$ is the number of target classes. The matrix compares the actual target values with those predicted by the model; it is a tabular summary of the number of correct and incorrect predictions made by a classifier.

Each row represents the instances in an actual class, while each column represents the instances in a predicted class. It is a special kind of contingency table.
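
For example, scikit-learn's `confusion_matrix` follows the same convention (rows are actual classes, columns are predicted classes). A short sketch with made-up labels for a 3-class problem:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels for a 3-class problem (N = 3); the data is made up.
y_true = ["cat", "dog", "cat", "bird", "dog", "cat"]
y_pred = ["cat", "cat", "cat", "bird", "dog", "bird"]

labels = ["bird", "cat", "dog"]
cm = confusion_matrix(y_true, y_pred, labels=labels)

# Row i = actual class labels[i], column j = predicted class labels[j].
print(cm)
```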

In unsupervised learning, it is usually called a matching matrix.

Key takeaway:

# Terminology

P (Condition Positive): the number of real positive cases in the data

N (Condition Negative): the number of real negative cases in the data

TP (True Positive): a positive case correctly predicted as positive

TN (True Negative): a negative case correctly predicted as negative

FP (False Positive) / Type I error: a negative case incorrectly predicted as positive

FN (False Negative) / Type II error: a positive case incorrectly predicted as negative
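
A small sketch (with made-up labels) showing how these four counts map onto a binary confusion matrix in scikit-learn:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical binary predictions (1 = positive, 0 = negative).
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

# With labels=[0, 1] the matrix is [[TN, FP], [FN, TP]],
# so ravel() unpacks it in the order tn, fp, fn, tp.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
print(f"TP={tp} TN={tn} FP={fp} FN={fn}")
```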

# Why not use only accuracy?

Classification accuracy by itself gives no information about how the misclassified instances are distributed across classes, so it is not well suited for imbalanced classes.
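
For instance, on a made-up 95:5 split, a model that always predicts the majority class reaches 95% accuracy while missing every positive case:

```python
# Made-up imbalanced data: 95 negatives, 5 positives.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100          # a "model" that always predicts the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
recall = tp / (tp + fn)

print(accuracy)  # 0.95 -- looks good
print(recall)    # 0.0  -- every positive case is missed
```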

# Classification Measures

Accuracy