
Understanding num_classes for xgboost in R

Tags: r, xgboost

I'm having a lot of trouble figuring out how to correctly set num_class for xgboost.

I've got an example using the iris data:

library(xgboost)

df <- iris

y <- df$Species
num.class <- length(levels(y))
levels(y) <- 1:num.class   # relabel the factor levels as 1, 2, 3
head(y)

df <- df[, 1:4]

y  <- as.matrix(y)
df <- as.matrix(df)

param <- list("objective" = "multi:softprob",    
          "num_class" = 3,    
          "eval_metric" = "mlogloss",    
          "nthread" = 8,   
          "max_depth" = 16,   
          "eta" = 0.3,    
          "gamma" = 0,    
          "subsample" = 1,   
          "colsample_bytree" = 1,  
          "min_child_weight" = 12)

model <- xgboost(param=param, data=df, label=y, nrounds=20)

This returns the following error:

Error in xgb.iter.update(bst$handle, dtrain, i - 1, obj) : 
SoftmaxMultiClassObj: label must be in [0, num_class), num_class=3 but found 3 in label

If I change num_class to 2 I get the same error. If I increase num_class to 4 then the model runs, but I get back 600 predicted probabilities (150 rows x 4 classes), which makes sense for 4 classes.

I'm not sure if I'm making an error or whether I'm failing to understand how xgboost works. Any help would be appreciated.

asked Mar 18 '16 by House


2 Answers

The label must be in [0, num_class), i.e. with three classes the valid labels are 0, 1 and 2. In your script, shift the labels down by one (add y <- y - 1 before the model <- ... line).
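
For reference, a minimal runnable sketch of the question's script with that fix applied. Note the factor is converted to integers before the shift (subtracting 1 from the character matrix that as.matrix(y) produces would fail), and the tuning parameters are trimmed to the essentials:

library(xgboost)

df <- as.matrix(iris[, 1:4])
y  <- as.integer(iris$Species) - 1   # labels are now 0, 1, 2

param <- list("objective"   = "multi:softprob",
              "num_class"   = 3,
              "eval_metric" = "mlogloss")

model <- xgboost(param = param, data = df, label = y, nrounds = 20)

pred <- predict(model, df)
length(pred)   # 450 = 150 rows x 3 classes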

answered Oct 21 '22 by RustamA


I ran into this rather weird problem as well. It seemed in my case to be a result of not properly encoding the labels.

First, using a string vector with N classes as the labels, I could only get the algorithm to run by setting num_class = N + 1. However, this result was useless, because I only had N actual classes and N+1 buckets of predicted probabilities.

I re-encoded the labels as integers, and then num_class worked fine when set to N. For example:

library(data.table)

# Convert classes to integers for xgboost (t1 is the training data)
class <- data.table(interest_level = c("low", "medium", "high"), class = c(0, 1, 2))
t1    <- merge(t1, class, by = "interest_level", all.x = TRUE, sort = FALSE)

and

param <- list(booster          = "gbtree",
              objective        = "multi:softprob",
              eval_metric      = "mlogloss",
              #nthread         = 13,
              num_class        = 3,
              eta_decay        = .99,
              eta              = .005,
              gamma            = 1,
              max_depth        = 4,
              min_child_weight = .9,  #1,
              subsample        = .7,
              colsample_bytree = .5)

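For completeness, here is a sketch of how these pieces might fit together. This is illustrative rather than code from my actual script: X stands for a numeric feature matrix built from t1, and the reshape at the end relies on multi:softprob returning num_class probabilities per row in row-major order.

library(xgboost)

# t1$class holds the 0-based integer labels built above;
# X (assumed here) is a numeric feature matrix for the same rows.
dtrain <- xgb.DMatrix(data = X, label = t1$class)
model  <- xgb.train(params = param, data = dtrain, nrounds = 100)

# Reshape the flat probability vector into a rows-by-classes matrix
pred <- predict(model, dtrain)
prob <- matrix(pred, ncol = 3, byrow = TRUE)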

answered Oct 21 '22 by Hack-R