How to deal with "[Warning] No further splits with positive gain, best gain: -inf"? Are any of my parameters unsuitable?
To get better accuracy, you can use a larger max_bin, a small learning_rate combined with a large num_iterations, and more training data. You can also increase num_leaves, but that may lead to overfitting. Speaking of overfitting, one way to deal with it is by increasing path_smooth.
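The knobs above can be gathered into a single parameter dict. This is a hedged sketch, not a recommended tuning: the objective and the concrete values are assumptions you would adjust for your own dataset.

```python
# Illustrative LightGBM parameter dict; values are placeholders, not a recipe.
params = {
    "objective": "regression",  # assumed task; change to match your problem
    "max_bin": 511,             # larger than the default 255: finer feature binning
    "learning_rate": 0.01,      # small step size...
    "num_iterations": 5000,     # ...compensated by many boosting rounds
    "num_leaves": 127,          # more leaves = more capacity, but overfitting risk
    "path_smooth": 1.0,         # > 0 regularizes leaf values to fight overfitting
}
```

You would pass this dict as the first argument to `lightgbm.train()`; the point is the trade-off each entry encodes, not the specific numbers.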
The min_data_in_leaf parameter is another way to reduce overfitting. It requires each leaf to contain at least the specified number of observations, so the model cannot become too specific. In practice, the validation loss stays roughly the same while the gap between training and validation loss shrinks, which means the degree of overfitting is reduced.
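The constraint can be sketched as a simple leaf-size check. This is a toy illustration of the idea, not LightGBM's actual implementation: a split is only kept if both resulting children hold at least min_data_in_leaf samples.

```python
def split_allowed(n_left, n_right, min_data_in_leaf):
    """Toy leaf-size check: a split is only valid if BOTH
    children contain at least min_data_in_leaf samples."""
    return n_left >= min_data_in_leaf and n_right >= min_data_in_leaf

# With min_data_in_leaf=1000, a node holding 1200 samples can never
# split into two valid children, so the tree stops growing there.
print(split_allowed(600, 600, min_data_in_leaf=1000))  # False
print(split_allowed(600, 600, min_data_in_leaf=500))   # True
```

This also shows why a too-large min_data_in_leaf triggers the warning in the question: every candidate split is vetoed, so no split with positive gain exists.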
Some explanation from LightGBM's issue tracker:
It means the learning of the tree in the current iteration should stop, because it cannot split any more.
I think this is caused by "min_data_in_leaf": 1000; you can set it to a smaller value.
This is not a bug, it is a feature.
The output message warns the user that your parameters may be wrong, or that your dataset is not easy to learn.
link: https://github.com/Microsoft/LightGBM/issues/640
So, on the contrary, the warning may simply mean that the data is hard to fit.
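The "-inf" in the message follows directly from this: when no candidate split satisfies the constraints, the best gain is the maximum over an empty set. A toy sketch (not LightGBM's real gain computation) of that behavior:

```python
# Toy sketch: if no candidate split passes the constraints, the best
# gain is the max over an empty set, i.e. -inf, and LightGBM prints
# "No further splits with positive gain, best gain: -inf".
def best_gain(candidate_gains):
    return max(candidate_gains, default=float("-inf"))

print(best_gain([]))            # -inf: no candidates at all
print(best_gain([-0.3, -0.1]))  # -0.1: still not positive, so no split
```

Either way, the tree stops growing at that node, which is exactly the non-bug behavior described in the issue above.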