
How to use advanced activation layers in Keras?

This is my code that works if I use other activation layers like tanh:

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(Activation(act))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16,
          show_accuracy=True, validation_split=0.2, verbose=2)

In this case it doesn't work: it raises "TypeError: 'PReLU' object is not callable", and the error is reported at the model.compile line. Why is this the case? All the non-advanced activation functions work, but none of the advanced activation layers, including this one, does.

pr338 asked Jan 11 '16

People also ask

What is advanced activation in keras?

Activations that are more complex than a simple TensorFlow function (e.g. learnable activations, which maintain a state) are available as Advanced Activation layers, and can be found in the module tf.keras.layers.
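To see what "maintain a state" means here, the following is a minimal pure-Python sketch (framework-free; the function name and alpha value are my own, not Keras API) of the computation a PReLU layer performs: positive inputs pass through unchanged, while negative inputs are scaled by a learnable slope alpha.

```python
def prelu(x, alpha):
    """PReLU forward pass: x if x > 0, else alpha * x.

    alpha is the layer's learnable state -- it is updated during
    training, which is why PReLU must be a layer rather than a
    plain stateless activation function.
    """
    return x if x > 0 else alpha * x

# With a hypothetical alpha of 0.25:
print(prelu(3.0, 0.25))   # -> 3.0   (positive inputs unchanged)
print(prelu(-2.0, 0.25))  # -> -0.5  (negative inputs scaled by alpha)
```

Because alpha is a trainable parameter, each PReLU instance carries its own state, unlike a stateless string activation such as 'tanh'.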

Can we use different activation functions in different layers?

In Keras, we can use a different activation function for each layer. That means that in our case we have to decide which activation function to use in the hidden layer and the output layer; in this post I will experiment only on the hidden layer, but it should also be relevant to the final layer.
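As a quick illustration of mixing activations across layers, here is a hedged pure-Python sketch (hypothetical weights, no framework) of a two-layer forward pass that applies tanh in the hidden layer and softmax at the output, mirroring the hidden/output split described above:

```python
import math

def tanh_layer(xs, w, b):
    # hidden layer: weighted sum of inputs followed by tanh
    return [math.tanh(sum(x * wi for x, wi in zip(xs, row)) + bi)
            for row, bi in zip(w, b)]

def softmax(xs):
    # output activation: exponentiate and normalize to probabilities
    exps = [math.exp(x - max(xs)) for x in xs]  # shift for stability
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 2-input -> 2-hidden -> softmax network
hidden = tanh_layer([1.0, -1.0], w=[[0.5, 0.5], [1.0, -1.0]], b=[0.0, 0.0])
probs = softmax(hidden)
print(probs)  # two class probabilities that sum to 1
```

The choice per layer is independent: swapping the hidden tanh for another activation leaves the softmax output layer untouched.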


2 Answers

The correct way to use advanced activations like PReLU is to add them with the add() method rather than wrapping them in an Activation class. Example:

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(act)
Tarantula answered Sep 30 '22


If you are using the Model (functional) API in Keras, you can call the activation layer directly inside another Keras layer. Here's an example:

from keras.models import Model
from keras.layers import Dense, Input
from keras.layers.advanced_activations import PReLU

# Model definition
# encoder
inp = Input(shape=(16,))
lay = Dense(64, kernel_initializer='uniform', activation=PReLU(),
            name='encoder')(inp)
# decoder
out = Dense(2, kernel_initializer='uniform', activation=PReLU(),
            name='decoder')(lay)

# build the model
model = Model(inputs=inp, outputs=out, name='cae')
Mattia Paterna answered Sep 30 '22