I want to reproduce LightGBM's default regression loss as a custom loss function. This is what I tried:
lgb.train(params=params, train_set=dtrain, num_boost_round=num_round, fobj=default_mse_obj)
With default_mse_obj being defined as:

def default_mse_obj(y_true, y_pred):
    residual = y_true - y_pred.get_label()
    grad = -2.0 * residual
    hess = 2.0 + (residual * 0)
    return grad, hess
However, the eval metrics are different for the default "regression" objective compared to the custom loss function defined above. I would like to know: what is the default loss function used by LightGBM for the "regression" objective?
As you can see here, this is the default loss function for the regression task:
import numpy as np

def default_mse_obj(y_pred, dtrain):
    # LightGBM passes the current predictions first and the Dataset second.
    y_true = dtrain.get_label()
    # L2 loss: the gradient is the residual, the hessian is constant.
    grad = (y_pred - y_true)
    hess = np.ones(len(grad))
    return grad, hess
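For reference, here is a minimal sketch (not from the original post; it uses a made-up random dataset and assumes an older LightGBM version where lgb.train still accepts the fobj argument, as in the question) that trains once with the built-in "regression" objective and once with the custom objective above, then compares the resulting training MSE:

import lightgbm as lgb
import numpy as np

# Made-up toy regression data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=500)

params = {"learning_rate": 0.1, "verbose": -1}

# Built-in objective.
bst_builtin = lgb.train({**params, "objective": "regression"},
                        lgb.Dataset(X, label=y), num_boost_round=50)

# Custom objective defined above, passed via fobj as in the question.
bst_custom = lgb.train(params, lgb.Dataset(X, label=y),
                       num_boost_round=50, fobj=default_mse_obj)

# The two training MSEs should be very close; a small gap can remain because
# the built-in objective starts boosting from the label mean
# (boost_from_average), while a custom objective starts from 0.
print(np.mean((bst_builtin.predict(X) - y) ** 2))
print(np.mean((bst_custom.predict(X) - y) ** 2))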