In R, after running a "random forest" model, I can use save.image("***.RData")
to store the model. Afterwards, I can simply load the model to make predictions directly.
Can you do a similar thing in Python? I separate the model and prediction into two files. In the model file:
```python
rf = RandomForestRegressor(n_estimators=250, max_features=9, compute_importances=True)
fit = rf.fit(Predx, Predy)
```
I tried returning rf or fit, but I still can't load the model in the prediction file.
Can you separate the model and prediction using the sklearn random forest package?
To save the model, all we need to do is pass the model object into Pickle's dump() function. This serializes the object, converting it into a byte stream that can be saved as a file.
Steps involved in the random forest algorithm:

Step 1: n random records are taken from a data set having k records.
Step 2: An individual decision tree is constructed for each sample.
Step 3: Each decision tree generates an output.
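The bagging portion of those three steps can be sketched with plain decision trees; the toy data, the number of trees, and the averaging at the end are illustrative assumptions, not part of scikit-learn's internal implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy data, made up for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X[:, 0] + rng.normal(scale=0.1, size=100)

n_trees = 10
trees = []
for _ in range(n_trees):
    # Step 1: draw a bootstrap sample (random records, with replacement)
    idx = rng.integers(0, len(X), size=len(X))
    # Step 2: fit an individual decision tree on that sample
    trees.append(DecisionTreeRegressor().fit(X[idx], y[idx]))

# Step 3: each tree generates an output; for regression, average them
preds = np.mean([t.predict(X) for t in trees], axis=0)
```

A real random forest additionally randomizes the features considered at each split, which this sketch omits.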
For example, I'd like to save some attributes of the model, so that next time it isn't necessary to call the fit function again to train my model. For example, for a GMM I would save the weights_, means_ and covariances_ of each component, so later I wouldn't need to train the model again.
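A sketch of that attribute-saving approach with scikit-learn's GaussianMixture (the file name gmm_params.npz and the toy data are assumptions): store the fitted arrays with NumPy, then assign them onto a fresh estimator so predict() works without calling fit() again.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy data, made up for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))

gmm = GaussianMixture(n_components=3, random_state=0).fit(X)

# Save only the fitted parameters instead of pickling the whole object
np.savez(
    "gmm_params.npz",
    weights=gmm.weights_,
    means=gmm.means_,
    covariances=gmm.covariances_,
    precisions_cholesky=gmm.precisions_cholesky_,
)

# Later: rebuild an equivalent model without training
params = np.load("gmm_params.npz")
gmm2 = GaussianMixture(n_components=3)
gmm2.weights_ = params["weights"]
gmm2.means_ = params["means"]
gmm2.covariances_ = params["covariances"]
gmm2.precisions_cholesky_ = params["precisions_cholesky"]

agree = np.array_equal(gmm.predict(X), gmm2.predict(X))
```

Note that precisions_cholesky_ must be saved too, since predict() uses it rather than the raw covariances.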
```python
import pickle  # on Python 2 this was the faster cPickle module
from sklearn.ensemble import RandomForestRegressor

rf = RandomForestRegressor()
rf.fit(X, y)
with open('path/to/file', 'wb') as f:
    pickle.dump(rf, f)

# in your prediction file
with open('path/to/file', 'rb') as f:
    rf = pickle.load(f)
preds = rf.predict(new_X)
```
You can use joblib to save and load a Random Forest from scikit-learn (in fact, any model from scikit-learn).

Example:
```python
import joblib
from sklearn.ensemble import RandomForestClassifier

# create RF
rf = RandomForestClassifier()

# fit on some data
rf.fit(X, y)

# save
joblib.dump(rf, "my_random_forest.joblib")

# load
loaded_rf = joblib.load("my_random_forest.joblib")
```
What is more, joblib.dump has a compress argument, so the model can be compressed. I ran a very simple test on the iris dataset, and compress=3 reduced the file size about 5.6 times.
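A quick way to try the compress argument yourself (the file names here are illustrative, and the exact size ratio will depend on the scikit-learn version and model):

```python
import os
import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
rf = RandomForestClassifier(random_state=0).fit(X, y)

# Save an uncompressed and a compressed copy, then compare file sizes
joblib.dump(rf, "rf.joblib")
joblib.dump(rf, "rf_compressed.joblib", compress=3)

plain = os.path.getsize("rf.joblib")
packed = os.path.getsize("rf_compressed.joblib")
print(f"uncompressed: {plain} bytes, compress=3: {packed} bytes")

# The compressed copy loads and predicts the same as the original
loaded = joblib.load("rf_compressed.joblib")
```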