
What are different options for objective functions available in xgboost.XGBClassifier?


Apart from binary:logistic (which is the default objective function), is there any other built-in objective function that can be used in xgboost.XGBClassifier()?

asked Aug 22 '17 by Venkatachalam

People also ask

What is the objective function in XGBoost?

The objective function XGBoost uses when predicting numerical values is "reg:squarederror", the squared-error loss function for regression predictive modeling problems.
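As a minimal sketch of what "reg:squarederror" optimizes, here is the per-sample loss together with the gradient and hessian that XGBoost's tree booster actually consumes (plain Python, no xgboost needed; XGBoost drops the constant factor 2 from the derivatives, which does not change the optimum):

```python
def squared_error(pred, label):
    """Per-sample reg:squarederror loss: (pred - label)^2."""
    return (pred - label) ** 2

def squared_error_grad_hess(pred, label):
    """First and second derivatives of the loss w.r.t. the prediction,
    in XGBoost's convention (constant factor 2 dropped)."""
    grad = pred - label   # d loss / d pred, up to a constant factor
    hess = 1.0            # second derivative, constant for squared error
    return grad, hess
```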

What are the parameters in XGBoost?

Before running XGBoost, we must set three types of parameters: general parameters, booster parameters, and task parameters. Learning task parameters decide on the learning scenario. For example, regression tasks may use different parameters than ranking tasks.

What does Eval_metric do in XGBoost?

The XGBoost Python API provides a method to assess the incremental performance as the number of trees grows. It takes two arguments: eval_set (usually the train and test sets) and the associated eval_metric to measure the error on those evaluation sets.

What is Max_depth in XGBoost?

The maximum depth can be specified in the XGBClassifier and XGBRegressor wrapper classes for XGBoost in the max_depth parameter. This parameter takes an integer value and defaults to a value of 3. We can tune this hyperparameter of XGBoost using the grid search infrastructure in scikit-learn on the Otto dataset.


2 Answers

It's true that binary:logistic is the default objective for XGBClassifier, but there is no reason you couldn't use the other objectives offered by the XGBoost package. For example, you can see in the sklearn.py source code that multi:softprob is used explicitly in the multiclass case.
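A minimal sketch of passing a different objective string to XGBClassifier. The objective strings are from the XGBoost docs; installing xgboost is assumed, so only the keyword arguments are assembled here and the constructor call is left commented out:

```python
# Keyword arguments you would pass to xgboost.XGBClassifier(**params).
binary_params = {"objective": "binary:logistic"}  # the default for 2 classes

multi_params = {
    "objective": "multi:softprob",  # per-class probabilities
    "num_class": 3,                 # required for the multi:* objectives
}

# With xgboost installed:
# clf = xgboost.XGBClassifier(**multi_params)
```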

Moreover, if it's really necessary, you can provide a custom objective function (details here).
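To illustrate, here is a hedged sketch of a custom objective that reproduces binary:logistic. XGBoost expects a callable returning per-sample gradients and hessians of the loss with respect to the raw margin scores; the real API hands you numpy arrays (and, in the low-level API, a DMatrix whose labels you fetch with get_label()), but plain lists are used here so the sketch runs standalone:

```python
import math

def logistic_obj(preds, labels):
    """Custom objective in the (grad, hess) form XGBoost expects.
    `preds` are raw margin scores, `labels` are 0/1."""
    grads, hesses = [], []
    for p, y in zip(preds, labels):
        prob = 1.0 / (1.0 + math.exp(-p))   # sigmoid of the raw score
        grads.append(prob - y)              # d loss / d margin
        hesses.append(prob * (1.0 - prob))  # second derivative
    return grads, hesses
```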

answered Oct 01 '22 by kawa


The default objective for XGBClassifier is binary:logistic (reg:linear is the default for XGBRegressor); however, there are other objectives as well:

- binary:logistic - returns predicted probabilities for the positive class
- multi:softmax - returns the hard predicted class for multiclass classification
- multi:softprob - returns per-class probabilities for multiclass classification

Note: when using multi:softmax as the objective, you also need to pass num_class, the number of classes. For example, for labels (0, 1, 2) there are 3 classes, so num_class = 3.
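The difference between the two multiclass objectives can be sketched in plain Python (no xgboost required): multi:softprob returns a probability per class for each row, while multi:softmax returns only the arg-max class index.

```python
import math

def softprob(scores):
    """What multi:softprob returns per row: one probability per class."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # numerically stable softmax
    total = sum(exps)
    return [e / total for e in exps]

def softmax_class(scores):
    """What multi:softmax returns per row: the arg-max class index."""
    return max(range(len(scores)), key=lambda i: scores[i])
```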

answered Oct 01 '22 by S Bhupendra Adhikari