 

What is the difference between these two ways of adding Neural Network layers in Keras?

I'm using Keras with Theano as the backend, and I have a Sequential neural network model.

I wonder, is there a difference between the following?

model.add(Convolution2D(32, 3, 3, activation='relu'))

and

model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
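For reference, here is a sketch of the same two spellings in the current Keras 2+ API, where Convolution2D was renamed Conv2D and the kernel size is passed as a tuple (this assumes a tensorflow.keras install; adapt the imports if you use standalone Keras):

```python
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Conv2D, Activation

# Fused form: the activation is applied inside the convolution layer.
fused = Sequential([
    Input(shape=(28, 28, 1)),
    Conv2D(32, (3, 3), activation='relu'),
])

# Split form: the same computation, but the activation is its own layer.
split = Sequential([
    Input(shape=(28, 28, 1)),
    Conv2D(32, (3, 3)),
    Activation('relu'),
])
```

Both models map a 28x28x1 input to a 26x26x32 output; only the layer bookkeeping differs.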
asked May 09 '16 by angubenko
People also ask

What types of neural network layers does Keras support?

Keras's core layers include Dense (dot product plus bias), Activation (transfer function or neuron shape), Dropout (randomly set a fraction of input units to 0 at each training update to avoid overfitting), Lambda (wrap an arbitrary expression as a Layer object), and several others.

What is sequential and dense in Keras?

In Keras, "dense" usually refers to a single layer, whereas "sequential" usually refers to an entire model, not just one layer. So a comparison of "Dense vs. Sequential" doesn't quite make sense. Sequential refers to the way you build models in Keras using the sequential API (from keras.models import Sequential).

What is the common layer type from the Keras API?

The Dense layer is a widely used Keras layer for creating a deeply connected layer in the neural network, where each neuron of the dense layer receives input from all neurons of the previous layer.


1 Answer

They are essentially the same. The advantage of specifying the activation separately is that you can add other layers (say, BatchNormalization) in between.
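As a sketch of that advantage, here is the conv → batch-norm → activation ordering written with the current Conv2D/tuple-kernel spelling (the question uses the older Keras 1 API), assuming a tensorflow.keras install:

```python
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation

# Keeping the activation as its own layer lets BatchNormalization sit
# between the convolution and the non-linearity -- something the fused
# Conv2D(..., activation='relu') form cannot express.
model = Sequential([
    Input(shape=(28, 28, 1)),
    Conv2D(32, (3, 3)),        # linear output: no activation yet
    BatchNormalization(),      # normalize the pre-activations
    Activation('relu'),        # non-linearity applied last
])
```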

In Keras, if not specified, Convolution2D uses the 'linear' activation by default, which is just the identity function:

def linear(x):
    '''
    The function returns the variable that is passed in, so all types work.
    '''
    return x 

and all the Activation layer does is apply the activation function to its input:

def call(self, x, mask=None):
    return self.activation(x)

Edit:

So basically Convolution2D(activation='relu') applies the relu activation function after performing the convolution, which is the same as applying Activation('relu') after Convolution2D(32, 3, 3).

The last two lines of the call function of the Convolution2D layer are

output = self.activation(output)
return output

where output is the result of the convolution. So we know that applying the activation function is the last step of Convolution2D.
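A small numpy check of that claim (a toy single-channel convolution, not Keras's actual implementation): applying relu to the convolution output afterwards gives exactly the same result as fusing it in as the last step.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv2d_valid(image, kernel, activation=None):
    """Toy 'valid' 2-D convolution; the activation is applied as the
    last step, mirroring what Convolution2D's call() does internally."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    if activation is not None:
        out = activation(out)   # last step, as in the source above
    return out

rng = np.random.default_rng(0)
img = rng.normal(size=(5, 5))
k = rng.normal(size=(3, 3))

fused = conv2d_valid(img, k, activation=relu)   # like Convolution2D(..., activation='relu')
split = relu(conv2d_valid(img, k))              # like Convolution2D(...) then Activation('relu')
print(np.allclose(fused, split))  # True
```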

Source code:
Convolution2D layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/convolutional.py
Activation layer: https://github.com/fchollet/keras/blob/a981a8c42c316831183cac7598266d577a1ea96a/keras/layers/core.py
activation functions: https://github.com/fchollet/keras/blob/master/keras/activations.py

answered Nov 15 '22 by dontloo