 

What is the difference between a layer with a linear activation and a layer without activation?

I'm playing with Keras a little bit and I'm wondering what the difference is between a layer with a linear activation and a layer with no activation at all. Don't they behave the same? If so, what's the point of the linear activation?

I mean the difference between these two code snippets:

 model.add(Dense(1500))
 model.add(Activation('linear'))
 model.add(Dense(1500))

and

 model.add(Dense(1500))
 model.add(Dense(1500))
asked May 03 '19 by T.Poe


1 Answer

If you don't specify an activation in a Dense layer, you get linear activation. From the Keras documentation:

activation: Activation function to use (see activations). If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x)
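To make that concrete, here is a minimal sketch (assuming TensorFlow 2.x and tf.keras) checking that the 'linear' activation really is the identity function:

import tensorflow as tf

x = tf.constant([[-2.0, 0.0, 3.5]])
# 'linear' activation is a(x) = x, so the output equals the input unchanged
print(tf.keras.activations.linear(x))  # tf.Tensor([[-2.   0.   3.5]], shape=(1, 3), dtype=float32)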

You only need to add an Activation layer if you want something other than 'linear'.

model.add(Dense(1500))
model.add(Activation('relu'))
model.add(Dense(1500))
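
And to answer the original question directly, here is a small sketch (again assuming TensorFlow 2.x and tf.keras, with an arbitrary layer size of 8 and input size of 10 for brevity) showing that the two snippets from the question behave identically once their weights match:

import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Activation

x = np.random.rand(4, 10).astype("float32")

# one model with an explicit linear Activation, one with no activation at all
with_linear = Sequential([Dense(8, input_shape=(10,)), Activation('linear')])
without_act = Sequential([Dense(8, input_shape=(10,))])
without_act.set_weights(with_linear.get_weights())  # Activation has no weights, only Dense does

# both models produce the same output, because 'linear' is a(x) = x
print(np.allclose(with_linear.predict(x), without_act.predict(x)))  # True

Note that you can also pass the activation straight to the Dense layer, e.g. Dense(1500, activation='relu'), instead of adding a separate Activation layer; the result is the same.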
answered Oct 09 '22 by Krunal V