I am using gradient boosting (the gbm package) for classification. The results are improving, but I am getting NaN in the ValidDeviance column.
library(gbm)  # gbm.fit() comes from the gbm package

Model <- gbm.fit(
  x = x_Train,
  y = y_Train,
  distribution = "bernoulli",       # binary classification
  n.trees = GBM_NTREES,
  shrinkage = GBM_SHRINKAGE,
  interaction.depth = GBM_DEPTH,
  n.minobsinnode = GBM_MINOBS,
  verbose = TRUE
)
Result: the verbose output shows NaN in the ValidDeviance column.
How do I tune the parameters so that ValidDeviance is actually computed?
I had the same issue; strangely, few of us seem to have run into this one ...
Adding train.fraction = 0.5 to the argument list solves the issue: by default all of the data is used for training, so there is no validation set and ValidDeviance cannot be computed unless train.fraction is explicitly set below 1.
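For example, a minimal sketch of the same call with the fix applied, assuming the same x_Train, y_Train and GBM_* constants from the question:

library(gbm)

# Reserve half of the rows for validation so gbm reports ValidDeviance
# instead of NaN. gbm.fit() takes the first train.fraction * nrow(x_Train)
# rows as the training set, so shuffle the rows first if they are ordered.
Model <- gbm.fit(
  x = x_Train,
  y = y_Train,
  distribution = "bernoulli",
  n.trees = GBM_NTREES,
  shrinkage = GBM_SHRINKAGE,
  interaction.depth = GBM_DEPTH,
  n.minobsinnode = GBM_MINOBS,
  train.fraction = 0.5,   # first 50% of rows train, remaining 50% validate
  verbose = TRUE
)

# The per-tree validation deviance is stored in Model$valid.error, and
# gbm.perf(Model, method = "test") picks the iteration that minimises it.

With a validation set in place you can also use gbm.perf() to choose n.trees instead of tuning it by hand.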