After identifying the best parameters using a pipeline and GridSearchCV, how do I pickle/joblib this process to re-use later? I see how to do this when it's a single classifier:

from sklearn.externals import joblib
joblib.dump(clf, 'filename.pkl')

But how do I save the overall pipeline with the best parameters after performing and completing a grid search?
I tried:
joblib.dump(grid, 'output.pkl') - but that dumped every grid search attempt (many files)
joblib.dump(pipeline, 'output.pkl') - but I don't think that contains the best parameters

X_train = df['Keyword']
y_train = df['Ad Group']
pipeline = Pipeline([
('tfidf', TfidfVectorizer()),
('sgd', SGDClassifier())
])
parameters = {'tfidf__ngram_range': [(1, 1), (1, 2)],
'tfidf__use_idf': (True, False),
'tfidf__max_df': [0.25, 0.5, 0.75, 1.0],
'tfidf__max_features': [10, 50, 100, 250, 500, 1000, None],
'tfidf__stop_words': ('english', None),
'tfidf__smooth_idf': (True, False),
'tfidf__norm': ('l1', 'l2', None),
}
grid = GridSearchCV(pipeline, parameters, cv=2, verbose=1)
grid.fit(X_train, y_train)
#These were the best combination of tuning parameters discovered
##best_params = {'tfidf__max_features': None, 'tfidf__use_idf': False,
## 'tfidf__smooth_idf': False, 'tfidf__ngram_range': (1, 2),
## 'tfidf__max_df': 1.0, 'tfidf__stop_words': 'english',
## 'tfidf__norm': 'l2'}
If you don't pickle large numpy arrays, regular pickle can be significantly faster, especially on large collections of small Python objects (e.g. a large dict of str objects), because the pickle module of the standard library is implemented in C while joblib is pure Python.
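For instance, here is a minimal sketch of persisting the refit best estimator with the standard-library pickle module; the 'model.pkl' filename is an arbitrary choice for illustration:

import pickle

# Persist the best estimator found by the grid search using the
# standard-library pickle module (implemented in C, fast for many
# small Python objects).
with open('model.pkl', 'wb') as f:
    pickle.dump(grid.best_estimator_, f)

# Load it back later in the same way.
with open('model.pkl', 'rb') as f:
    model = pickle.load(f)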
You can export Pipeline objects using joblib or pickle, similarly to how you export other scikit-learn estimators.
import joblib
joblib.dump(grid.best_estimator_, 'filename.pkl')
If you want to dump your object into one file, use:
joblib.dump(grid.best_estimator_, 'filename.pkl', compress=1)
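To re-use the saved pipeline later, load it back and predict directly on raw text. A minimal sketch, where new_keywords is a hypothetical list of strings and the filename matches the dump above:

import joblib

# Reload the fitted pipeline (TfidfVectorizer + SGDClassifier together).
model = joblib.load('filename.pkl')

# The pipeline vectorizes the raw strings itself, so no refitting is needed.
# new_keywords is hypothetical example data.
new_keywords = ['cheap flights', 'running shoes']
predictions = model.predict(new_keywords)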