I'm playing around with Keras a bit and wondering: what is the difference between a linear activation layer and no activation layer at all? Don't they behave the same? If so, what's the point of the linear activation?
I mean the difference between these two code snippets:
model.add(Dense(1500))
model.add(Activation('linear'))
model.add(Dense(1500))
and
model.add(Dense(1500))
model.add(Dense(1500))
If you don't specify an activation in the Dense layer, it defaults to the linear activation. This is from the Keras documentation:
activation: Activation function to use (see activations). If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x)
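As a quick sanity check (a minimal sketch, assuming TensorFlow's bundled Keras), you can verify that an explicit Activation('linear') leaves a Dense layer's output unchanged, so the two snippets above compute exactly the same thing:

import numpy as np
from tensorflow.keras.layers import Dense, Activation

x = np.random.rand(4, 10).astype("float32")

dense = Dense(1500)
y = dense(x)                        # Dense with its default (linear) activation
y_linear = Activation("linear")(y)  # explicit linear activation on top

print(np.allclose(y.numpy(), y_linear.numpy()))  # True - identical outputs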
You only need to add an Activation layer if you want to use something other than 'linear', for example:
model.add(Dense(1500))
model.add(Activation('relu'))
model.add(Dense(1500))
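Equivalently, you can pass the activation directly to the Dense layer instead of adding a separate Activation layer (same standard Keras API):

model.add(Dense(1500, activation='relu'))
model.add(Dense(1500))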