Let's say I have an LSTM layer in Keras like this:
x = Input(shape=(input_shape), dtype='int32')
x = LSTM(128,return_sequences=True)(x)
Now I am trying to add Dropout to this layer using:
X = Dropout(0.5)
but this gives an error. I am assuming the line above is redefining X instead of applying Dropout to the layer. How do I fix this?
In Keras, dropout is implemented by randomly selecting nodes to be dropped out with a given probability (e.g., 20%) in each weight update cycle.
Dropout is an important concept in machine learning, used to address over-fitting. Input data often contains unwanted information, usually called noise; by randomly dropping units during training, dropout keeps the model from fitting that noise and thus helps prevent over-fitting.
Usually, dropout is placed on the fully connected layers only, because they are the ones with the greatest number of parameters and are therefore the most likely to co-adapt excessively and cause overfitting.
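For illustration, here is a minimal sketch of Dropout placed after fully connected layers. It assumes the tensorflow.keras import path, and the input size, layer widths, and 20% rate are made up for the example:

from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

inputs = Input(shape=(100,))                  # assumed feature size
h = Dense(64, activation='relu')(inputs)
h = Dropout(0.2)(h)                           # randomly zeroes 20% of the units each update
h = Dense(64, activation='relu')(h)
h = Dropout(0.2)(h)
outputs = Dense(1, activation='sigmoid')(h)
model = Model(inputs, outputs)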
Just add x = Dropout(0.5)(x)
like this:
x = Input(shape=(input_shape), dtype='int32')
x = LSTM(128,return_sequences=True)(x)
x = Dropout(0.5)(x)
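Note that Dropout(0.5) on its own just constructs the layer; calling it on the tensor, Dropout(0.5)(x), is what actually wires it into the graph. A fuller sketch of the same pattern, assuming the tensorflow.keras import path, float inputs, and a placeholder value for input_shape (none of which are in the original snippet):

from tensorflow.keras.layers import Input, LSTM, Dropout
from tensorflow.keras.models import Model

input_shape = (50, 32)                         # assumed (timesteps, features); substitute your own
inputs = Input(shape=input_shape)
x = LSTM(128, return_sequences=True)(inputs)   # returns the full output sequence
x = Dropout(0.5)(x)                            # drops 50% of the LSTM outputs during training
model = Model(inputs, x)
model.summary()

As a design choice, Keras's LSTM layer also accepts its own dropout and recurrent_dropout arguments, which apply dropout to the layer's inputs and recurrent state respectively, if you prefer that over a separate Dropout layer.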