Why does

GradientBoostingClassifier(loss='ls')

fail with:

raise ValueError("``n_classes`` must be 1 for regression")
ValueError: ``n_classes`` must be 1 for regression

while it works perfectly with loss='deviance'?
I'm using scikit-learn 0.11 with scipy 0.11.0rc1 on 64-bit Ubuntu. This happened while classifying a dataset with the binary classes 'YES' and 'NO'.
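For reference, a minimal sketch that reproduces the error on scikit-learn 0.11 (the toy data below is hypothetical, standing in for my real dataset):

>>> import numpy as np
>>> from sklearn.ensemble import GradientBoostingClassifier
>>> X = np.random.RandomState(0).rand(20, 3)  # hypothetical features
>>> y = np.array(['YES', 'NO'] * 10)          # binary string labels
>>> GradientBoostingClassifier(loss='ls').fit(X, y)
Traceback (most recent call last):
    ...
ValueError: ``n_classes`` must be 1 for regression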
Gradient boosting can be used for both regression and classification problems; for regression, where the least-squares loss applies, use GradientBoostingRegressor:

>>> from sklearn.datasets import make_regression
>>> from sklearn.ensemble import GradientBoostingRegressor
>>> from sklearn.model_selection import train_test_split
>>> X, y = make_regression(random_state=0)
>>> X_train, X_test, y_train, y_test = train_test_split(
...     X, y, random_state=0)
>>> reg = GradientBoostingRegressor(random_state=0)
>>> reg.fit(X_train, y_train)
This is a bug in GradientBoostingClassifier: it shouldn't expose the least-squares loss function for classification. Please use the "deviance" loss function instead. Sorry for the inconvenience caused.
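For example, a minimal working sketch (the toy 'YES'/'NO' data is hypothetical; note that in much newer scikit-learn releases this loss has been renamed 'log_loss', but 'deviance' is correct for 0.11):

>>> import numpy as np
>>> from sklearn.ensemble import GradientBoostingClassifier
>>> X = np.random.RandomState(0).rand(20, 3)  # hypothetical features
>>> y = np.array(['YES', 'NO'] * 10)          # binary string labels
>>> clf = GradientBoostingClassifier(loss='deviance')  # the default loss for classification
>>> clf = clf.fit(X, y)
>>> clf.predict(X[:2])  # returns an array of 'YES'/'NO' labels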
PS: If you really need least-squares loss for classification, please contact me and we can work on this feature for a future release.