
How to save and load MLLib model in Apache Spark?

I trained a classification model in Apache Spark (using pyspark). I stored the model in an object, LogisticRegressionModel. Now, I want to make predictions on new data. I would like to store the model, and read it back into a new program in order to make the predictions. Any idea how to store the model? I'm thinking of maybe pickle, but I'm a newbie to both python and Spark, so I'd like to hear what the community thinks.

asked Dec 14 '15 by berto77


1 Answer

You can save your model by using the save method that MLlib models provide.

# let lrm be a LogisticRegressionModel
lrm.save(sc, "lrm_model.model")

After saving it, you can load it back in another application:

from pyspark.mllib.classification import LogisticRegressionModel
sameModel = LogisticRegressionModel.load(sc, "lrm_model.model")
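
Putting it together, here is a minimal end-to-end sketch. The training data and the appName are made up for illustration; the path "lrm_model.model" is the one used above. It trains a model with LogisticRegressionWithLBFGS, saves it, loads it back, and predicts on new feature vectors:

from pyspark import SparkContext
from pyspark.mllib.classification import LogisticRegressionWithLBFGS, LogisticRegressionModel
from pyspark.mllib.regression import LabeledPoint

sc = SparkContext(appName="save-load-lr-model")

# Toy training data, purely illustrative
training = sc.parallelize([
    LabeledPoint(0.0, [0.0, 1.0]),
    LabeledPoint(1.0, [1.0, 0.0]),
])

model = LogisticRegressionWithLBFGS.train(training)
model.save(sc, "lrm_model.model")

sameModel = LogisticRegressionModel.load(sc, "lrm_model.model")
print(sameModel.predict([1.0, 0.0]))                              # single feature vector
print(sameModel.predict(sc.parallelize([[0.0, 1.0]])).collect())  # RDD of feature vectors

Note that save writes out a directory (model metadata plus data) rather than a single file, so the path can point to local storage or HDFS depending on how Spark is configured.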

As @zero323 stated before, there is another way to achieve this, which is to use the Predictive Model Markup Language (PMML).

PMML is an XML-based file format developed by the Data Mining Group to provide a way for applications to describe and exchange models produced by data mining and machine learning algorithms.

answered Oct 11 '22 by Alberto Bonsanto