 

How to use warm_start


I'd like to use the warm_start parameter to add training data to my random forest classifier. I expected it to be used like this:

clf = RandomForestClassifier(...)
clf.fit(get_data())
clf.fit(get_more_data(), warm_start=True)

But the warm_start parameter is a constructor parameter. So do I do something like this?

clf = RandomForestClassifier()
clf.fit(get_data())
clf = RandomForestClassifier(warm_start=True)
clf.fit(get_more_data())

That makes no sense to me. Won't the new call to the constructor discard previous training data? I think I'm missing something.

zoran119 asked Mar 13 '17


2 Answers

The basic pattern (taken from Miriam's answer):

clf = RandomForestClassifier(warm_start=True)
clf.fit(get_data())
clf.fit(get_more_data())

would be the correct usage API-wise.

But there is an issue here.

As the docs say:

When set to True, reuse the solution of the previous call to fit and add more estimators to the ensemble, otherwise, just fit a whole new forest.

it means that the only thing warm_start can do for you is add new DecisionTrees. All the previous trees are left untouched!

Let's check this in the source:

n_more_estimators = self.n_estimators - len(self.estimators_)

if n_more_estimators < 0:
    raise ValueError('n_estimators=%d must be larger or equal to '
                     'len(estimators_)=%d when warm_start==True'
                     % (self.n_estimators, len(self.estimators_)))

elif n_more_estimators == 0:
    warn("Warm-start fitting without increasing n_estimators does not "
         "fit new trees.")

This basically tells us that you need to increase the number of estimators before calling fit again!
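To illustrate that requirement, here is a minimal sketch (assuming scikit-learn and NumPy are installed; the random data is purely illustrative): warm_start keeps the existing trees and only fits as many new ones as needed to reach the updated n_estimators.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)
X = rng.rand(100, 4)                # toy features
y = rng.randint(0, 2, 100)          # toy binary labels

clf = RandomForestClassifier(n_estimators=5, warm_start=True)
clf.fit(X, y)
n_before = len(clf.estimators_)     # 5 trees after the first fit

clf.n_estimators = 10               # must grow the target before refitting
clf.fit(X, y)                       # keeps the first 5 trees, fits 5 new ones
n_after = len(clf.estimators_)
```

Refitting without changing n_estimators would only trigger the warning shown above and fit no new trees.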

I have no idea what kind of usage sklearn expects here. I'm not sure if fitting, increasing internal variables and fitting again is correct usage, but I somehow doubt it (especially as n_estimators is not a public class variable).

Your basic approach (with regard to this library and this classifier) is probably not a good idea for out-of-core learning! I would not pursue this further.

sascha answered Sep 28 '22


Just to add to @sascha's excellent answer, this hacky method works:

rf = RandomForestClassifier(n_estimators=1, warm_start=True)
rf.fit(X_train, y_train)
rf.n_estimators += 1
rf.fit(X_train, y_train)
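For completeness, a sketch of the same trick applied in a loop over several batches (the random batches here are a hypothetical stand-in for a real data stream). Note the caveat from the answer above still holds: each tree only ever sees the single batch it was fitted on, and old trees are never updated.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)
# Hypothetical stand-in for a stream of incoming data batches
batches = [(rng.rand(50, 3), rng.randint(0, 2, 50)) for _ in range(3)]

rf = RandomForestClassifier(n_estimators=10, warm_start=True)
for X_batch, y_batch in batches:
    rf.fit(X_batch, y_batch)    # keeps old trees, fits only the new ones
    rf.n_estimators += 10       # grow the target size before the next fit

n_trees = len(rf.estimators_)   # 3 batches x 10 trees each
```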
Sergey Makarevich answered Sep 28 '22