I have a very basic query. I have built four almost identical CNNs (the only difference being their input shapes), merged their outputs, and connected the result to a feed-forward network of fully connected layers.
Code for one of the almost identical CNNs:
from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Dropout, Flatten

model3 = Sequential()
model3.add(Convolution2D(32, (3, 3), activation='relu', padding='same',
                         input_shape=(batch_size[3], seq_len, channels)))
model3.add(MaxPooling2D(pool_size=(2, 2)))
model3.add(Dropout(0.1))
model3.add(Convolution2D(64, (3, 3), activation='relu', padding='same'))
model3.add(MaxPooling2D(pool_size=(2, 2)))
model3.add(Flatten())
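The merge step itself is not shown above. As a minimal numpy sketch of what it does numerically (the branch output sizes here are made up for illustration), merging simply concatenates the four flattened feature vectors and feeds the result to a fully connected layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical flattened outputs of the four CNN branches (sizes invented).
out1, out2, out3, out4 = (rng.random((1, 64)) for _ in range(4))

# The merge concatenates the feature vectors along the last axis...
merged = np.concatenate([out1, out2, out3, out4], axis=-1)

# ...and the result feeds a fully connected layer: y = relu(merged @ W + b).
W = rng.random((merged.shape[1], 16))
b = np.zeros(16)
dense = np.maximum(merged @ W + b, 0.0)

print(merged.shape)  # (1, 256)
print(dense.shape)   # (1, 16)
```

In Keras this is typically done with the functional API and a Concatenate layer rather than Sequential, but the arithmetic is the same.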
But on tensorboard I see all the Dropout layers are interconnected, and Dropout1 is of different color than Dropout2,3,4,etc which all are the same color.
Dropout may be applied to any or all hidden layers in the network, as well as to the visible (input) layer; it is not used on the output layer. The term "dropout" refers to dropping out units (hidden and visible) in a neural network.
Dropout can be used after convolutional layers (e.g. Conv2D) and after pooling layers (e.g. MaxPooling2D). Often, dropout is only used after the pooling layers, but this is just a rough heuristic. In this case, dropout is applied to each element or cell within the feature maps.
What is dropout? The term "dropout" refers to dropping out nodes (input and hidden) in a neural network. All forward and backward connections of a dropped node are temporarily removed, creating a new network architecture out of the parent network.
A dropout layer does not have any trainable parameters, so nothing in it is updated during training. Dropout is a technique used to combat overfitting; its rate argument takes a value between 0 and 1.
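To make the mechanics concrete, here is a minimal numpy sketch of "inverted" dropout, the variant Keras uses: a fraction of elements is zeroed and the survivors are rescaled by 1/(1-rate) so the expected activation is unchanged, and at inference time the layer is a no-op:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training):
    """Inverted dropout: zero a fraction `rate` of the elements and rescale
    the survivors by 1/(1-rate) so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return x  # at evaluation/prediction time dropout is an identity
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones((4, 4))
train_out = dropout(x, rate=0.5, training=True)  # entries are 0.0 or 2.0
eval_out = dropout(x, rate=0.5, training=False)  # identical to the input
```

This is why the layer needs to know whether the model is currently training, which is exactly what the answer below the definitions explains.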
I know this is an old question, but I had the same issue myself, and just now I realized what's going on.
Dropout is only applied while the model is training; it should be deactivated by the time we're evaluating/predicting. For that purpose, Keras creates a learning_phase placeholder, set to 1.0 if we're training the model. This placeholder is created inside the first Dropout layer you create and is shared across all of them. So that's what you're seeing there!
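The sharing behavior can be illustrated with a toy sketch in plain Python (this is not the Keras source, just a model of the singleton pattern it uses): the first dropout layer creates the shared placeholder, and every later layer reuses the same object, which is why TensorBoard draws edges from every Dropout node back to the first one:

```python
class LearningPhase:
    """Stand-in for Keras's learning_phase placeholder (a singleton)."""
    _shared = None

    @classmethod
    def get(cls):
        if cls._shared is None:
            cls._shared = cls()  # created by the FIRST dropout layer only
        return cls._shared

    def __init__(self):
        self.training = False

class ToyDropout:
    """Toy dropout layer: every instance grabs the shared placeholder."""
    def __init__(self, rate):
        self.rate = rate
        self.phase = LearningPhase.get()  # shared across all instances

d1, d2, d3, d4 = (ToyDropout(0.1) for _ in range(4))
# All four layers hold the SAME placeholder object, hence the
# interconnections between Dropout nodes in the TensorBoard graph.
print(d1.phase is d2.phase is d3.phase is d4.phase)  # True
```

So the edges are a graph-construction artifact, not a sign that your four CNN branches are wired together incorrectly.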