
Calculating precision, recall and FScore from the results of a confusion matrix in R

I have got the following confusion matrix. Now I need to calculate the precision, recall and F-score from it. How do I do that using the obtained values?

Confusion Matrix and Statistics

      Reference
Prediction One Zero
      One   37   43
      Zero  19  131

               Accuracy : 0.7304          
                 95% CI : (0.6682, 0.7866)
    No Information Rate : 0.7565          
    P-Value [Acc > NIR] : 0.841087        

                  Kappa : 0.3611          
 Mcnemar's Test P-Value : 0.003489        

            Sensitivity : 0.6607          
            Specificity : 0.7529          
         Pos Pred Value : 0.4625          
         Neg Pred Value : 0.8733          
             Prevalence : 0.2435          
         Detection Rate : 0.1609          
   Detection Prevalence : 0.3478          
      Balanced Accuracy : 0.7068          

       'Positive' Class : One

I've used the following edited code after suggestions from other users:

# caret and e1071 provide createDataPartition()/confusionMatrix() and svm()
library(class)
library(e1071)
library(caret)
library(party)
library(nnet)
library(forecast)

pimad <- read.csv("C:/Users/USER/Desktop/AMAN/pimad.csv")
nrow(pimad)

# Shuffle the rows, then make a 70/30 train/test split
set.seed(9850)
gp <- runif(nrow(pimad))
pimad <- pimad[order(gp), ]
idx <- createDataPartition(y = pimad$class, p = 0.7, list = FALSE)
train <- pimad[idx, ]
test <- pimad[-idx, ]

# Fit a radial-kernel SVM and predict on the test set
svmmodel <- svm(class ~ ., train, kernel = "radial")
psvm <- predict(svmmodel, test)
table(psvm, test$class)

# Search for relevant functions
library(sos)
findFn("confusion matrix precision recall FScore")

# Build the caret confusion matrix object
df <- confusionMatrix(test$class, psvm)

# My attempts at indexing it directly (this is where I'm stuck)
dim(df)
df[1,2]/sum(df[1,2:3])
df
asked Nov 07 '15 by amankedia

People also ask

How do you calculate recall from confusion matrix?

In an imbalanced classification problem with two classes, recall is calculated as the number of true positives divided by the total number of true positives and false negatives. The result is a value between 0.0 for no recall and 1.0 for full or perfect recall.
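Using the cell counts from the question's matrix above (with 'One' as the positive class), that works out to something like this in R (a minimal sketch; the numbers are taken directly from the printed table):

# Cells of the question's confusion matrix, 'One' as the positive class
TP <- 37    # predicted One,  reference One
FN <- 19    # predicted Zero, reference One
FP <- 43    # predicted One,  reference Zero

recall    <- TP / (TP + FN)   # 37/56 = 0.6607, matches Sensitivity above
precision <- TP / (TP + FP)   # 37/80 = 0.4625, matches Pos Pred Value above
fscore    <- 2 * precision * recall / (precision + recall)   # about 0.544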

How do you calculate F1 in R?

To calculate the F1 score this way, first install and load the caret package in your R session, then call the confusionMatrix() function and pass it the required parameters.
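As a minimal sketch, with hypothetical pred and truth factor vectors standing in for your own predictions and reference labels:

library(caret)

# Hypothetical example data; substitute your own factors with matching levels
truth <- factor(c("One", "One", "Zero", "Zero", "One", "Zero"))
pred  <- factor(c("One", "Zero", "Zero", "Zero", "One", "One"))

cm <- confusionMatrix(pred, truth, positive = "One")
cm$byClass["F1"]   # F1 score for the positive class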


1 Answer

There is nothing else you need to do; you've already got all the requested measures in df. Just type:

ls(df)
[1] "byClass"  "dots"     "mode"     "overall"  "positive" "table"

df$byClass # This is another example I've worked on

Now all the measures, including sensitivity, specificity, positive predictive value, negative predictive value, precision, recall, F1, prevalence, detection rate, detection prevalence and balanced accuracy, appear in a table.
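For example, a minimal sketch of pulling out the individual values (assuming df was built with caret's confusionMatrix() exactly as in the question):

library(caret)

# df <- confusionMatrix(test$class, psvm)   # as in the question

precision <- df$byClass["Pos Pred Value"]   # precision = positive predictive value
recall    <- df$byClass["Sensitivity"]      # recall = sensitivity
f1        <- df$byClass["F1"]               # F1, reported directly by recent caret versions

# Alternatively, ask caret to report precision/recall up front:
# confusionMatrix(test$class, psvm, mode = "prec_recall")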

answered Sep 21 '22 by r. ahmadi