 

Weights argument in R gbm function

What is the weights argument for in the R gbm function? Does it implement cost-sensitive stochastic gradient boosting?

Antoine asked Sep 29 '22


1 Answer

You may have already read this, but the documentation says that the weights parameter is defined in this way:

an optional vector of weights to be used in the fitting process. Must be positive but do not need to be normalized. If keep.data=FALSE in the initial call to gbm then it is the user’s responsibility to resupply the weights to gbm.more.

Thus my interpretation would be that they are standard observation weights as in any statistical model.
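To illustrate that reading, here is a minimal sketch of passing observation weights to `gbm` (it assumes the `gbm` package is installed; the data and the weight vector `w` are made up for demonstration):

```r
library(gbm)

set.seed(1)
n <- 500
x <- runif(n)
y <- sin(2 * pi * x) + rnorm(n, sd = 0.3)

# Hypothetical weighting: up-weight the second half of the observations.
# Weights must be positive but need not be normalized.
w <- c(rep(1, n / 2), rep(5, n / 2))

fit <- gbm(y ~ x,
           data         = data.frame(x = x, y = y),
           distribution = "gaussian",
           weights      = w,
           n.trees      = 100,
           keep.data    = TRUE)  # so weights need not be resupplied to gbm.more
```

Up-weighted observations contribute more to the loss being minimized at each boosting iteration, exactly as weights do in `lm` or `glm`.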

Is it cost-sensitive? Good question. I first noticed that one of the main citations for the package is:

B. Kriegler (2007). Cost-Sensitive Stochastic Gradient Boosting Within a Quantitative Regression Framework.

so I figured it does imply cost-sensitivity, though the term is never used explicitly in the vignette, so it was not immediately apparent.

I did a bit of a deeper dive, though, and found some more resources. You can find the equations describing the weights towards the end of this article, which describes the package.

I also found this question being asked way back in 2009 on a mailing list, and while there was no response there, I finally found a scholarly article discussing the use of gbm and other R packages for cost-sensitive gradient boosting.

The conclusion is that gbm's quantile loss function is differentiable and can be used in cost-sensitive applications wherein over- and under-estimation carry different error costs; however, other loss functions (aside from quantile) may be necessary or more appropriate in some cost-sensitive applications.
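The asymmetric-cost idea above can be sketched with `gbm`'s quantile loss (again assuming the `gbm` package is installed; the `alpha = 0.75` value is an illustrative choice, not taken from the paper):

```r
library(gbm)

set.seed(1)
d <- data.frame(x = runif(500))
d$y <- 2 * d$x + rnorm(500, sd = 0.2)

# alpha > 0.5 penalizes under-estimation more heavily than over-estimation,
# so the fit is pulled upward toward the conditional 75th percentile.
fit <- gbm(y ~ x,
           data         = d,
           distribution = list(name = "quantile", alpha = 0.75),
           n.trees      = 200)

pred <- predict(fit, newdata = d, n.trees = 200)
```

Choosing `alpha` sets the relative cost of the two error directions: `alpha = 0.5` recovers symmetric (absolute-error) loss, while values nearer 0 or 1 encode increasingly lopsided costs.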

That paper centers on gbm but also discusses other packages; if your focus is cost-sensitive gradient boosting, you may want to look at the others it mentions as well.

Hack-R answered Oct 19 '22