What is the best way to convert a scikit-learn model (e.g. the result of a RandomForestClassifier fit) into a piece of C++, so as to get the fastest possible .so
that can be called from some other ecosystem?
To save the model, all we need to do is pass the model object into pickle's dump() function. This serializes the object into a byte stream that we can write to a file.
You can use pickle to serialize a trained machine learning model and save it to disk, then later load the file to deserialize the model and use it to make new predictions.
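A minimal sketch of that save/load round trip, assuming a toy dataset and an illustrative file name ("model.pkl", not from the original question):

```python
import pickle

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Fit a small forest on synthetic data for illustration.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

# Serialize the fitted estimator to a byte stream on disk.
with open("model.pkl", "wb") as f:
    pickle.dump(clf, f)

# Later (possibly in another process): deserialize and predict.
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored.predict(X[:5]))
```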
For portability of trained scikit-learn models to other languages, see the sklearn-porter project.
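A hedged sketch of exporting the classifier with sklearn-porter; the Porter API and the set of supported target languages differ between versions, so treat the exact calls and arguments below as assumptions to verify against the project's README:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn_porter import Porter  # assumption: sklearn-porter is installed

# Fit the model to be transpiled (toy data for illustration).
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

# Transpile the trained forest to C source code.
porter = Porter(clf, language='c')
with open("forest.c", "w") as f:
    f.write(porter.export(embed_data=True))

# The generated forest.c can then be compiled with a C compiler, e.g. gcc;
# the exact flags depend on whether you build an executable or a shared object.
```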
Though, whether this will be faster than the original RandomForestClassifier.predict
method (which is multithreaded and uses numpy operations, potentially backed by a fast BLAS library) remains to be seen.
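Before investing in a C/C++ port, it is worth timing the native predict path as a baseline. A sketch of such a measurement, with illustrative data sizes (numbers will be machine-dependent):

```python
import timeit

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# A larger synthetic batch so the timing is not dominated by overhead.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0).fit(X, y)

# Time the multithreaded native predict over the whole batch.
elapsed = timeit.timeit(lambda: clf.predict(X), number=10)
print(f"native predict: {elapsed / 10 * 1000:.1f} ms per batch of {len(X)}")
```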