 

How to disable dropout during prediction in Keras?

I am using dropout in a neural network model in Keras. A small part of the code looks like this:

model.add(Dropout(0.5))
model.add(Dense(classes))

For testing, I am using preds = model_1.predict_proba(image).

But during testing, Dropout is also participating in the prediction of the score, which should not happen. I have searched a lot for a way to disable dropout but haven't found any hints yet.

Does anyone have a solution to disable Dropout while testing in Keras?

Akhilesh asked Dec 13 '17


People also ask

Is dropout used in inference?

Dropout as regularization has been used extensively to prevent overfitting for training neural networks. During training, units and their connections are randomly dropped, which could be considered as sampling many different submodels from the original model.

Is dropout used in CNNs?

We can apply a Dropout layer to the input vector, in which case it nullifies some of its features; but we can also apply it to a hidden layer, in which case it nullifies some hidden neurons. Dropout layers are important in training CNNs because they prevent overfitting on the training data.
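For instance, a minimal sketch of both placements (the layer sizes and the 784-feature input shape are made up for illustration):

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
# Dropout applied to the input vector: nullifies 20% of input features
model.add(Dropout(0.2, input_shape=(784,)))
model.add(Dense(128, activation='relu'))
# Dropout applied to a hidden layer: nullifies 50% of hidden activations
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))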

Does dropout increase speed?

From the abstract of "Controlled dropout: A different dropout for improving training speed on deep neural network": Dropout is a technique widely used for preventing overfitting while training deep neural networks. However, applying dropout to a neural network typically increases the training time.

When should you not use dropout?

Don't use dropout after the training phase, as you do not want to ignore inputs and signals when the network is in use. Moreover, using dropout in the last layer is not ideal, and convolutional layers should not use the 1-D version of dropout.
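For convolutional layers, Keras provides spatial dropout variants that drop entire feature maps rather than individual activations; a sketch (the filter count and input shape are illustrative):

from keras.models import Sequential
from keras.layers import Conv2D, SpatialDropout2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
# SpatialDropout2D drops whole feature maps, which suits convolutional
# activations better than element-wise (1-D style) dropout
model.add(SpatialDropout2D(0.2))
model.add(Flatten())
model.add(Dense(10, activation='softmax'))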


2 Answers

Keras does this by default. In Keras, dropout is disabled in test mode. You can look at the code here and see that the dropped input is used during training and the actual input during testing.
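In outline, the switch that code performs looks roughly like this (a paraphrase of the behaviour, not the actual Keras source):

from keras import backend as K

def dropout_behaviour(inputs, rate):
    def dropped_inputs():
        return K.dropout(inputs, rate)
    # K.in_train_phase returns the first branch when the learning phase
    # is 1 (training) and the second when it is 0 (testing/prediction),
    # so predict() sees the untouched inputs
    return K.in_train_phase(dropped_inputs, inputs)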

As far as I know, you have to build your own training function from the layers and specify the training flag in order to predict with dropout (i.e., it's not possible to specify a training flag for the predict functions); a sketch of this follows below. This is a problem if you want to do GANs, which use the intermediate output for training and also train the network as a whole, because of a divergence between generated training images and generated test images.
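One way to do that with the Keras backend API of that era is to build a function whose inputs include the learning phase explicitly; a minimal sketch, assuming model is a compiled Keras model and x is an input batch:

from keras import backend as K

# Build a backend function that takes the learning phase flag as an input
predict_fn = K.function([model.input, K.learning_phase()], [model.output])

preds_test = predict_fn([x, 0])[0]   # learning phase 0: dropout disabled
preds_train = predict_fn([x, 1])[0]  # learning phase 1: dropout active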

Thomas Pinetz answered Oct 07 '22


As previously stated, dropout in Keras happens only at train time (with the retained activations scaled up proportionately during training, so that the learned weights are appropriate for prediction when dropout is disabled).

This is not ideal for cases in which we wish to use a dropout network as a probabilistic predictor (one that produces a distribution when asked to predict the same inputs repeatedly). In other words, Keras' Dropout layer is designed to give you regularization at train time but the "mean function" of the learned distribution when predicting.

If you want to retain dropout for prediction, you can easily implement a permanent dropout ("PermaDropout") layer (this was based on suggestions made by F. Chollet on the GitHub discussion area for Keras):

from keras.layers.core import Lambda
from keras import backend as K

def PermaDropout(rate):
    return Lambda(lambda x: K.dropout(x, level=rate))

By replacing any dropout layer in a Keras model with "PermaDropout", you'll get the probabilistic behavior in prediction as well.

# define the LSTM model
n_vocab = text_to_train.n_vocab

model = Sequential()
model.add(LSTM(n_vocab*4,
               input_shape=input_shape,
               return_sequences=True))
# Replace Dropout with PermaDropout
# model.add(Dropout(0.3))
model.add(PermaDropout(0.3))
model.add(LSTM(n_vocab*2))
# Replace Dropout with PermaDropout
# model.add(Dropout(0.3))
model.add(PermaDropout(0.3))
# model.add(Dense(n_vocab*2))
model.add(Dense(n_vocab, activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
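With PermaDropout in place, repeated predictions on the same input differ, so you can aggregate them into an approximate predictive distribution (Monte Carlo dropout); a sketch, assuming x is a prepared input batch:

import numpy as np

# Each call samples a different dropout mask, so predictions vary
samples = np.stack([model.predict(x) for _ in range(100)])
mean_pred = samples.mean(axis=0)  # point estimate
pred_std = samples.std(axis=0)    # per-class spread as an uncertainty proxy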
T3am5hark answered Oct 07 '22