Apart from binary:logistic (which is the default objective function), is there any other built-in objective function that can be used in xgboost.XGBClassifier()?
XGBoost loss for regression: the objective function XGBoost uses when predicting numerical values is "reg:squarederror", the standard squared-error loss for regression predictive modeling problems.
Before running XGBoost, we must set three types of parameters: general parameters, booster parameters, and learning task parameters. Learning task parameters decide on the learning scenario; for example, regression tasks may use different parameters than ranking tasks.
The XGBoost Python API provides a way to assess performance incrementally as trees are added. It takes two arguments: "eval_set" (usually the train and test sets) and the associated "eval_metric" used to measure error on those evaluation sets.
The maximum depth can be specified in the XGBClassifier and XGBRegressor wrapper classes for XGBoost in the max_depth parameter. This parameter takes an integer value and defaults to a value of 3. We can tune this hyperparameter of XGBoost using the grid search infrastructure in scikit-learn on the Otto dataset.
It's true that binary:logistic is the default objective for XGBClassifier, but I don't see any reason why you couldn't use the other objectives offered by the XGBoost package. For example, you can see in the sklearn.py source code that multi:softprob is used explicitly in the multiclass case.
Moreover, if it's really necessary, you can provide a custom objective function (details here).
The default objective for XGBClassifier is binary:logistic (reg:linear, now reg:squarederror, is the regression default for XGBRegressor); however, there are other options as well:
binary:logistic - returns predicted probabilities for the positive class
multi:softmax - returns the hard class label for multiclass classification
multi:softprob - returns class probabilities for multiclass classification
Note: when using multi:softmax as the objective with the native API, you also need to pass num_class, the number of classes. For example, for labels (0, 1, 2) there are 3 classes, so num_class = 3.