I'm using Keras with Theano as the backend, and I have a Sequential neural network model.
I wonder: is there a difference between the following?
model.add(Convolution2D(32, 3, 3, activation='relu'))
and
model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
They are essentially the same. The advantage of specifying the activation separately is that you can insert other layers (say, BatchNormalization) in between.
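For example, a common pattern is to normalize the convolution outputs before applying the non-linearity. A minimal sketch using the Keras 1.x API from the question (the input shape is a made-up example, assuming Theano's channels-first ordering):

from keras.models import Sequential
from keras.layers import Convolution2D, Activation
from keras.layers.normalization import BatchNormalization

model = Sequential()
# Convolution with no non-linearity yet (linear activation by default)
model.add(Convolution2D(32, 3, 3, input_shape=(3, 64, 64)))
# Normalize the pre-activations
model.add(BatchNormalization())
# Apply the ReLU afterwards, as a separate layer
model.add(Activation('relu'))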
In Keras, if not specified, Convolution2D uses the 'linear' activation by default, which is just the identity function:
def linear(x):
    '''
    The function returns the variable that is passed in, so all types work.
    '''
    return x
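So, assuming nothing else changes, these two layer definitions behave identically (a trivial illustration, not taken from the Keras source):

Convolution2D(32, 3, 3)                       # activation defaults to 'linear'
Convolution2D(32, 3, 3, activation='linear')  # explicit, same behaviour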
and all the Activation layer does is apply the activation function to its input:
def call(self, x, mask=None):
    return self.activation(x)
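As a side note, the string 'relu' is resolved through keras.activations.get, so passing the string or the activation function itself should be equivalent (a small sketch under that assumption):

from keras.activations import relu
from keras.layers import Activation

a1 = Activation('relu')  # the string is looked up in keras.activations
a2 = Activation(relu)    # passing the function directly works too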
Edit:
So basically Convolution2D(32, 3, 3, activation='relu') applies the relu activation function after performing the convolution, which is the same as applying Activation('relu') after Convolution2D(32, 3, 3). The last two lines of the call function of the Convolution2D layer are

output = self.activation(output)
return output

where output is the output of the convolution. So we know that applying the activation function is the last step of Convolution2D.
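If you want to convince yourself, here is a quick check (a sketch with made-up shapes, assuming the Keras 1.x API and Theano's channels-first ordering) that the two formulations produce identical outputs once they share the same weights:

import numpy as np
from keras.models import Sequential
from keras.layers import Convolution2D, Activation

# Fused form: convolution and ReLU in one layer
fused = Sequential()
fused.add(Convolution2D(32, 3, 3, activation='relu', input_shape=(3, 16, 16)))

# Split form: linear convolution followed by a separate Activation layer
split = Sequential()
split.add(Convolution2D(32, 3, 3, input_shape=(3, 16, 16)))
split.add(Activation('relu'))

# Activation has no weights, so the weight lists line up one-to-one
split.set_weights(fused.get_weights())

x = np.random.rand(1, 3, 16, 16).astype('float32')
assert np.allclose(fused.predict(x), split.predict(x))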
Source code:
Convolution2D layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/convolutional.py
Activation layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/core.py
activation functions: https://github.com/fchollet/keras/blob/master/keras/activations.py