 

Tuning XGBoost parameters in R

Tags: r, xgboost, r-caret

I am trying to tune XGBoost parameters using the caret package in R, but I get the error

Error in train.default(x = as.matrix(df_train %>% select(-c(Response,  : 
  The tuning parameter grid should have columns nrounds, lambda, alpha 

whenever I try to train the model, even though the columns nrounds, lambda, and alpha are there.

library(caret)
library(xgboost)
library(readr)
library(dplyr)
library(tidyr)

xgb_grid_1 <- expand.grid(
  nrounds = 2400,
  eta = c(0.01, 0.001, 0.0001),
  lambda = 1,
  alpha = 0
)

xgb_trcontrol <- trainControl(
  method = "cv",
  number = 5,
  verboseIter = TRUE,
  returnData = FALSE,
  returnResamp = "all",
  allowParallel = TRUE
)

xgb_train_1 <- train(
  x = as.matrix(df_train %>% select(-c(Response, Id))),
  y = df_train$Response,
  trControl = xgb_trcontrol,
  tuneGrid = xgb_grid_1,
  method = "xgbLinear"
)
asked Nov 27 '15 by AppleGate0

People also ask

Does XGBoost require tuning?

Building a model with XGBoost is easy, but improving it is difficult (at least I struggled a lot). The algorithm uses many parameters, so parameter tuning is a must to improve the model.

What is ETA in XGBoost R?

eta: the learning (or shrinkage) parameter. It controls how much information from each new tree is used in the boosting. It must be greater than 0 and at most 1; when it is close to zero, only a small piece of information from each new tree is used.


1 Answer

The problem lies in your xgb_grid_1: if you remove the eta line, it will work.

The standard tuning parameters for xgboost via caret's "xgbLinear" method are "nrounds", "lambda", and "alpha" — not eta. Use the modelLookup function to see which parameters a caret model exposes. If you want to tune eta as well, you will have to create your own custom caret model that includes it as a tuning parameter.
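If eta really must be part of the grid, a custom model can expose it. The sketch below is an illustration, not caret's built-in definition: the list components follow caret's documented custom-model interface, the model name xgbLinearEta is made up, and the xgboost::xgboost() call assumes the classic function signature with a regression objective.

```r
library(caret)
library(xgboost)

# Custom caret model: xgbLinear plus eta as a fourth tuning parameter.
# A sketch only -- component names follow caret's custom-model interface.
xgbLinearEta <- list(
  library = "xgboost",
  type    = "Regression",
  parameters = data.frame(
    parameter = c("nrounds", "lambda", "alpha", "eta"),
    class     = rep("numeric", 4),
    label     = c("# Boosting Iterations", "L2 Regularization",
                  "L1 Regularization", "Learning Rate")
  ),
  grid = function(x, y, len = NULL, search = "grid") {
    expand.grid(nrounds = 100, lambda = c(0, 1e-4),
                alpha = c(0, 1e-4), eta = c(0.3, 0.1))
  },
  fit = function(x, y, wts, param, lev, last, weights, classProbs, ...) {
    xgboost::xgboost(
      data = as.matrix(x), label = y,
      booster = "gblinear", objective = "reg:squarederror",
      nrounds = param$nrounds, eta = param$eta,
      lambda = param$lambda, alpha = param$alpha,
      verbose = 0
    )
  },
  predict = function(modelFit, newdata, submodels = NULL) {
    predict(modelFit, as.matrix(newdata))
  },
  prob = NULL,
  sort = function(x) x[order(x$nrounds), ]
)

# Used by passing the list itself as method:
# train(x, y, method = xgbLinearEta, trControl = xgb_trcontrol,
#       tuneGrid = expand.grid(nrounds = 2400, lambda = 1, alpha = 0,
#                              eta = c(0.01, 0.001, 0.0001)))
```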

modelLookup("xgbLinear")
      model parameter                 label forReg forClass probModel
1 xgbLinear   nrounds # Boosting Iterations   TRUE     TRUE      TRUE
2 xgbLinear    lambda     L2 Regularization   TRUE     TRUE      TRUE
3 xgbLinear     alpha     L1 Regularization   TRUE     TRUE      TRUE
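Given those three parameters, a corrected version of the question's call might look like the sketch below. It reuses the question's df_train and xgb_trcontrol; the lambda/alpha values are illustrative, and passing a fixed (untuned) eta through train()'s `...` assumes caret forwards extra arguments to the underlying xgboost fit.

```r
# Grid restricted to the parameters caret actually tunes for xgbLinear
xgb_grid_1 <- expand.grid(
  nrounds = 2400,
  lambda = c(0, 1),
  alpha = c(0, 1)
)

xgb_train_1 <- train(
  x = as.matrix(df_train %>% select(-c(Response, Id))),
  y = df_train$Response,
  trControl = xgb_trcontrol,
  tuneGrid = xgb_grid_1,
  method = "xgbLinear",
  eta = 0.01  # fixed, not tuned; forwarded to xgboost via `...`
)
```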
answered Oct 24 '22 by phiver