Compute precision and accuracy using numpy

Tags: python, numpy

Suppose I have two lists, true_values = [1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0] and predictions = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0]. How can I compute the accuracy and the precision using numpy?


David asked Feb 16 '26 15:02

2 Answers

import numpy as np

true_values = np.array([1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
predictions = np.array([1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0])

N = true_values.size

# accuracy: fraction of predictions that match the true labels
accuracy = (true_values == predictions).sum() / N

# precision: true positives over all positive predictions
TP = ((predictions == 1) & (true_values == 1)).sum()
FP = ((predictions == 1) & (true_values == 0)).sum()
precision = TP / (TP + FP)

This is the most concise way I came up with (assuming sklearn isn't allowed); there might be an even shorter one, though!
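As a quick sanity check (a minimal sketch continuing the snippet above), printing both values reproduces the numbers in the sklearn answer below:

print(accuracy)   # 0.3333333333333333
print(precision)  # 0.375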

ymentha14 answered Feb 18 '26 03:02


This is what sklearn, which uses numpy behind the scenes, is for:

from sklearn.metrics import precision_score, accuracy_score

true_values = [1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0]
predictions = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0]

accuracy_score(true_values, predictions), precision_score(true_values, predictions)

Output:

(0.3333333333333333, 0.375)
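To see where these numbers come from: 4 of the 12 predictions match the true labels, so accuracy = 4/12 ≈ 0.333, and the predictions contain 3 true positives against 5 false positives, so precision = 3/(3+5) = 0.375.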
Quang Hoang answered Feb 18 '26 04:02