Short Version:
Is there a way to instruct caret to maximize the sensitivity of a specific class while training a classification model?
Long Version:
I have a dataframe
> feature1 <- c(1,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0)
> feature2 <- c(1,0,1,1,1,0,1,1,1,0,1,1,1,0,1,1,1,0,1,1)
> feature3 <- c(0,1,1,0,0,1,1,0,0,1,1,0,0,1,1,0,0,1,1,0)
> TARGET <- factor(make.names(c(1,0,1,1,0,0,1,0,1,1,1,0,1,0,0,0,1,0,1,1)))
> df <- data.frame(feature1, feature2, feature3, TARGET)
And model training is implemented like
> ctrl <- trainControl(
+ method="repeatedcv",
+ repeats = 2)
>
> tuneGrid <- expand.grid(k = c(2,5,7))
>
> tune <- train(
+ TARGET ~ .,
+ metric = '???',
+ maximize = TRUE,
+ data = df,
+ method = "knn",
+ trControl = ctrl,
+ preProcess = c("center","scale"),
+ tuneGrid = tuneGrid
+ )
> sclasses <- predict(tune, newdata = df)
> df$PREDICTION <- make.names(factor(sclasses), unique = FALSE, allow_ = TRUE)
I want to maximize the sensitivity (recall) = A / ( A + C ),
where the Event (the positive class in caret's confusion-matrix table) should in my case be X1 = action taken. But caret uses X0 = no action taken as the event.
I can set the positive class for my confusion matrix with the positive argument:
> confusionMatrix(df$PREDICTION, df$TARGET, positive = "X1")
But is there any way to set this while training (maximizing sensitivity)?
I already checked whether there is another metric that fits my need, but I wasn't able to find one in the documentation. Do I have to implement my own summaryFunction for trainControl?
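For what it's worth, this is roughly what I imagine such a summaryFunction would look like (untested sketch; the names sensX1 and SensX1 are my own):

```r
library(caret)

# Sketch of a custom summary function: report the sensitivity of
# the "X1" class, no matter which factor level comes first.
sensX1 <- function(data, lev = NULL, model = NULL) {
  c(SensX1 = sensitivity(data$pred, data$obs, positive = "X1"))
}

ctrl <- trainControl(
  method = "repeatedcv",
  repeats = 2,
  summaryFunction = sensX1)

# then presumably: train(..., metric = "SensX1", maximize = TRUE, trControl = ctrl, ...)
```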
Thanks!
As far as I know, there is no direct way to specify this in the training (I have been searching for this myself for a while now). However, I found a workaround: you can simply reorder the levels of the target variable in the dataframe. Since caret takes the first factor level as the positive class by default, this solves your problem. Just add one line of code and that does the trick:
TARGET <- factor(make.names(c(1,0,1,1,0,0,1,0,1,1,1,0,1,0,0,0,1,0,1,1)))
TARGET <- relevel(TARGET, "X1")
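With the releveled target, caret's built-in twoClassSummary then computes Sens for X1 (it always treats the first factor level as the event), so you can tune on sensitivity directly. A sketch using the data frame from the question (note that twoClassSummary requires classProbs = TRUE):

```r
library(caret)

feature1 <- c(1,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0)
feature2 <- c(1,0,1,1,1,0,1,1,1,0,1,1,1,0,1,1,1,0,1,1)
feature3 <- c(0,1,1,0,0,1,1,0,0,1,1,0,0,1,1,0,0,1,1,0)
TARGET <- factor(make.names(c(1,0,1,1,0,0,1,0,1,1,1,0,1,0,0,0,1,0,1,1)))
TARGET <- relevel(TARGET, "X1")  # X1 is now the first level, i.e. the event
df <- data.frame(feature1, feature2, feature3, TARGET)

ctrl <- trainControl(
  method = "repeatedcv",
  repeats = 2,
  classProbs = TRUE,               # twoClassSummary needs class probabilities
  summaryFunction = twoClassSummary)

tune <- train(
  TARGET ~ .,
  data = df,
  method = "knn",
  metric = "Sens",                 # sensitivity of the first level, i.e. X1
  maximize = TRUE,
  trControl = ctrl,
  preProcess = c("center", "scale"),
  tuneGrid = expand.grid(k = c(2, 5, 7)))
```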