So, I have a classifier which looks like
clf = VotingClassifier(estimators=[
    ('nn', MLPClassifier()),
    ('gboost', GradientBoostingClassifier()),
    ('lr', LogisticRegression()),
], voting='soft')
And I want to tune the hyperparameters of each of the estimators individually.
Is there a way to tune these "combinations" of classifiers? Thanks
You can do this using GridSearchCV with a small modification: in the parameters dictionary, instead of specifying the attribute directly, use the key for the classifier in the VotingClassifier object, followed by __, and then the attribute itself.
Check out this example
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import GridSearchCV

X = np.array([[-1.0, -1.0], [-1.2, -1.4], [-3.4, -2.2], [1.1, 1.2],
              [-1.0, -1.0], [-1.2, -1.4], [-3.4, -2.2], [1.1, 1.2]])
y = np.array([1, 1, 2, 2, 1, 1, 2, 2])
eclf = VotingClassifier(estimators=[
    ('svm', SVC(probability=True)),
    ('lr', LogisticRegression()),
], voting='soft')
# Use the key for the classifier followed by __ and the attribute
params = {'lr__C': [1.0, 100.0],
          'svm__C': [2, 3, 4]}

grid = GridSearchCV(estimator=eclf, param_grid=params, cv=2)
grid.fit(X, y)
print(grid.best_params_)
# {'lr__C': 1.0, 'svm__C': 2}
Use GridSearchCV:
clf = VotingClassifier(
    estimators=[('lr', LogisticRegression()),
                ('gboost', GradientBoostingClassifier())],
    voting='soft')

# Put the combinations of parameters here
# (note: the GradientBoostingClassifier parameter is n_estimators, not n_estimator)
p = {'lr__C': [1, 2], 'gboost__n_estimators': [10, 20]}

grid = GridSearchCV(clf, p, cv=5, scoring='neg_log_loss')
grid.fit(X_train, Y_train)
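Putting both answers together, here is a self-contained sketch of the same pattern applied to the asker's original three-estimator ensemble. The toy data, parameter values, and MLP settings below are illustrative only:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Tiny made-up dataset, just enough for the grid search to run.
X = np.array([[-1.0, -1.0], [-1.2, -1.4], [-3.4, -2.2], [1.1, 1.2],
              [-1.0, -1.1], [-1.3, -1.4], [-3.5, -2.2], [1.0, 1.2]])
y = np.array([1, 1, 2, 2, 1, 1, 2, 2])

clf = VotingClassifier(estimators=[
    ('nn', MLPClassifier(max_iter=1000)),
    ('gboost', GradientBoostingClassifier()),
    ('lr', LogisticRegression()),
], voting='soft')

# Each key follows the '<estimator name>__<hyperparameter>' convention.
params = {
    'nn__hidden_layer_sizes': [(10,), (20,)],
    'gboost__n_estimators': [10, 20],
    'lr__C': [1.0, 100.0],
}

grid = GridSearchCV(estimator=clf, param_grid=params, cv=2)
grid.fit(X, y)
print(grid.best_params_)
```

After fitting, `grid.best_estimator_` is the VotingClassifier refit on the full data with the winning combination, so you can use it directly for prediction.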