from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def build_classifier():
    classifier = Sequential()
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=1, init='uniform', activation='sigmoid'))
    classifier.compile(optimizer='adam', loss='binary_crossentropy',
                       metrics=['accuracy'])
    return classifier

KC = KerasClassifier(build_fn=build_classifier)
parameters = {'batch_size': [25, 32],
              'epochs': [100, 500],
              'optimizer': ['adam', 'rmsprop']}
grid_search = GridSearchCV(estimator=KC,
                           param_grid=parameters, scoring='accuracy', cv=10)
grid_search.fit(X_train, y_train)
I want to test the model with different optimizers, but I can't seem to add the optimizer to the grid search. Whenever I run the program, it throws an error while fitting the training set:
ValueError: optimizer is not a legal parameter
Cross-Validation and GridSearchCV
Cross-validation is used while training the model. As we know, before training the model we divide the data into two parts: train data and test data. In cross-validation, the train data is divided further into two parts: the train data and the validation data.
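For intuition, here is a minimal sketch of k-fold cross-validation using scikit-learn's cross_val_score; the estimator and synthetic data are placeholders, not the Keras model from the question.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data and estimator, just to illustrate the k-fold idea.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# cv=10: the training data is split into 10 folds; each fold serves once
# as the validation set while the remaining 9 folds are used for fitting.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         scoring='accuracy', cv=10)
print(scores.mean(), scores.std())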
In that benchmark, only ~7.5k records were used for training with cv=3 and ~3k records for testing. For a parameter grid of 3125 combinations, GridSearchCV took 10,856 seconds (~3 hrs), whereas HalvingGridSearchCV took 465 seconds (~8 mins), approximately 23x faster.
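If the full grid is too slow, HalvingGridSearchCV is a near drop-in replacement in scikit-learn 0.24+ (still behind an experimental import); a rough sketch, reusing the KC estimator and parameters dictionary from above:

# Successive-halving alternative (scikit-learn >= 0.24; experimental import required).
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV

halving_search = HalvingGridSearchCV(estimator=KC, param_grid=parameters,
                                     scoring='accuracy', cv=3)
halving_search.fit(X_train, y_train)
print(halving_search.best_params_)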
GridSearchCV tries all the combinations of the values passed in the dictionary and evaluates the model for each combination using cross-validation. Hence, after using this function we get the accuracy/loss for every combination of hyperparameters, and we can choose the one with the best performance.
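Once the search has been fitted, the per-combination scores are available in the standard cv_results_ attribute; for example:

# Mean cross-validated accuracy for every hyperparameter combination tried.
for params, score in zip(grid_search.cv_results_['params'],
                         grid_search.cv_results_['mean_test_score']):
    print(params, round(score, 4))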
The documentation of keras for scikit-learn says:
sk_params takes both model parameters and fitting parameters. Legal model parameters are the arguments of build_fn. Note that like all other estimators in scikit-learn, build_fn should provide default values for its arguments, so that you could create the estimator without passing any values to sk_params.
GridSearchCV will call get_params() on KerasClassifier to get a list of valid parameters that can be passed to it, which according to your code:
KC = KerasClassifier(build_fn=build_classifier)
will be empty (since you are not specifying any parameters in build_classifier).
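You can check this yourself: with no sk_params and an argument-free build_classifier, the wrapper exposes no model parameters that the search could set (a hedged illustration; output shown approximately):

# Since build_classifier takes no arguments and no sk_params were passed,
# 'optimizer' is not a legal model parameter for the wrapper.
KC = KerasClassifier(build_fn=build_classifier)
print(KC.get_params())  # roughly: {'build_fn': <function build_classifier at 0x...>}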
Change that to something like:
# Used a parameter to specify the optimizer
def build_classifier(optimizer='adam'):
    ...
    classifier.compile(optimizer=optimizer, loss='binary_crossentropy',
                       metrics=['accuracy'])
    ...
    return classifier
After that it should work.
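For completeness, a short sketch of the corrected search end to end (reusing parameters, X_train and y_train from your code); best_params_ and best_score_ are standard GridSearchCV attributes:

# With the corrected build_classifier, 'optimizer' is now an argument of
# build_fn and therefore a legal parameter, so the original grid works.
KC = KerasClassifier(build_fn=build_classifier)
grid_search = GridSearchCV(estimator=KC, param_grid=parameters,
                           scoring='accuracy', cv=10)
grid_search.fit(X_train, y_train)

# Inspect the winning combination and its cross-validated accuracy.
print(grid_search.best_params_)
print(grid_search.best_score_)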