
GridSearchCV with exceptions in the param grid

I'm trying GridSearchCV, but I would like to have some exceptions in the param grid. Here's my grid search code:

from sklearn.model_selection import GridSearchCV
from keras.wrappers.scikit_learn import KerasClassifier
from keras.models import Sequential
from keras.layers import Dense, Dropout

def create_model(input_dim=25, activation='relu', units=100, optimizer='adam', init='he_normal', dropout_rate=0.33):
    model = Sequential()
    model.add(Dense(input_dim=input_dim,
                    units=units,
                    kernel_initializer=init,
                    activation=activation))
    model.add(Dropout(dropout_rate))
    model.add(Dense(1, kernel_initializer=init, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])
    return model

model = KerasClassifier(build_fn=create_model, epochs=10, batch_size=64, verbose=1)

#lr = [0.01, 0.001, 0.0001] # [x/100 for x in range(1, 10)] #learning rate for optimizer
units = [int(train_X.shape[1]/2), train_X.shape[1], train_X.shape[1]*2, train_X.shape[1]*3]
batch_size = [32, 64, 128, 256]
optimizer = ['SGD', 'RMSprop', 'Adagrad', 'Adadelta', 'Adam', 'Adamax', 'Nadam']
epochs = [50, 100, 200]
init = ['uniform', 'lecun_uniform', 'normal', 'zero', 'glorot_normal', 'glorot_uniform', 'he_normal', 'he_uniform']
activation = ['softmax', 'softplus', 'softsign', 'relu', 'tanh', 'sigmoid', 'hard_sigmoid', 'linear']
dropout_rate = [0.2, 0.3, 0.4, 0.5]

cv = [(slice(None), slice(None))]

param_grid = dict(units=units, batch_size=batch_size, optimizer=optimizer, epochs=epochs, 
                  init=init, activation=activation, dropout_rate=dropout_rate)

grid = GridSearchCV(cv=cv, estimator=model, param_grid=param_grid, n_jobs=1)
grid_result = grid.fit(train_X, train_y, validation_data=(valid_X, valid_y))

When I run this code, even though KerasClassifier has the parameter epochs=10, the grid never runs 10 epochs; it runs for the [50, 100, 200] values I provided in the grid. It's as if epochs=10 was overwritten.

Now what I want to do is use different activation functions in the first layer but keep sigmoid in the output layer. What I'm afraid of is: will the activation='sigmoid' parameter be overwritten by the activation = ['softmax', 'softplus', 'softsign', 'relu', 'tanh', 'sigmoid', 'hard_sigmoid', 'linear'] list that comes from the grid?
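To make the mechanics concrete, here is a plain-Python toy (no Keras involved) of what I believe is happening: the grid only injects values into build_fn's named keyword parameters, while a literal hard-coded inside the function body is not a parameter and so has nothing to be overwritten by:

```python
def create_model(activation='relu'):
    # the grid supplies this keyword argument...
    hidden_activation = activation
    # ...but this hard-coded literal is not a parameter, so nothing can override it
    output_activation = 'sigmoid'
    return hidden_activation, output_activation

print(create_model(activation='tanh'))  # → ('tanh', 'sigmoid')
```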

I hope this one makes more sense to you.

asked Mar 18 '26 18:03 by iso_9001_


1 Answer

I'm not familiar with KerasClassifier, but I think you're attempting to accomplish too much without understanding what's going on. With the parameters you're currently testing, the grid has 4 × 4 × 7 × 3 × 8 × 8 × 4 = 86,016 candidate combinations to fit for a single CV fold! Assuming your dataset isn't completely trivial and each fit takes at least 2 seconds, you're looking at roughly 48 hours of grid search right there, at least. It's been known for a while now that grid search isn't the most effective CV strategy for expensive functions like a neural net. Random search and Bayesian optimization have proven to be more efficient and capable of giving comparable or better results. However, as I stated, this is getting too complicated!

I recommend reading up on all those activation functions, optimizers, learning rates, and such, and narrowing your search space. Do as much tailoring to your data as you can before starting CV. Moreover, it's good practice to implement some sort of fold-based CV, such as k-fold or stratified k-fold. Read up on these too; they're important.
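For reference, a minimal sketch of stratified k-fold with sklearn on toy data; in the question's setup you would pass `cv=StratifiedKFold(...)` to GridSearchCV instead of the single-split `cv = [(slice(None), slice(None))]` trick:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

X = np.arange(20).reshape(10, 2)               # toy features
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])  # toy binary labels

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for train_idx, valid_idx in skf.split(X, y):
    # stratification keeps the 0/1 class ratio in every fold
    assert sorted(y[valid_idx].tolist()) == [0, 1]
```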

If you're still looking to implement this, you might just find it easier to manually write two for loops: an outer one to iterate over each parameter and an inner one to iterate over each hyper-parameter value. Within the innermost loop, you could build, compile, and fit your model right there without having to use sklearn or KerasClassifier at all (which is hiding a lot of important details). You could also take this opportunity to learn more about functional vs. sequential Keras; the former is arguably the more powerful.
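A sketch of that manual loop using sklearn's ParameterGrid; a LogisticRegression on synthetic data stands in for the Keras build/compile/fit step, which would go in the same spot:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import ParameterGrid, train_test_split

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)

# a deliberately small space; with Keras you would build/compile/fit here instead
grid = ParameterGrid({'C': [0.01, 0.1, 1.0], 'penalty': ['l2']})

best_score, best_params = -1.0, None
for params in grid:
    clf = LogisticRegression(max_iter=1000, **params).fit(X_tr, y_tr)
    score = clf.score(X_va, y_va)        # validation accuracy for this config
    if score > best_score:
        best_score, best_params = score, params

print(best_params, round(best_score, 3))
```

Since you control the loop body, logging per-configuration results, early stopping, or skipping invalid parameter combinations (the "exceptions" from the question) all become one-line `if`/`continue` checks.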

I apologize for the non-answer; I just think you may be causing yourself more headache than necessary. Good luck!

answered Mar 20 '26 09:03 by Alex L

