 

How do you use Keras LeakyReLU in Python?

I am trying to build a CNN with Keras, and wrote the following code:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, Activation, MaxPooling2D, Flatten, Dense

batch_size = 64
epochs = 20
num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(Activation('relu'))
cnn_model.add(Dense(num_classes, activation='softmax'))

cnn_model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adam(),
                  metrics=['accuracy'])

I want to use Keras's LeakyReLU activation layer instead of Activation('relu'). I tried passing LeakyReLU(alpha=0.1) in its place, but since LeakyReLU is a layer in Keras rather than an activation function, I get an error about using an activation layer where an activation function is expected.

How can I use LeakyReLU in this example?

Jack Trute asked Feb 16 '18

People also ask

What is LeakyReLU in Keras?

The LeakyReLU class is a leaky version of a Rectified Linear Unit. It allows a small gradient when the unit is not active: f(x) = alpha * x if x < 0, and f(x) = x if x >= 0.
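A minimal sketch of what that definition looks like numerically, assuming TensorFlow 2.x with tf.keras and eager execution (note that in Keras 3 the argument is named negative_slope instead of alpha):

import numpy as np
from tensorflow.keras.layers import LeakyReLU

layer = LeakyReLU(alpha=0.1)                            # slope of 0.1 for negative inputs
x = np.array([-3.0, -1.0, 0.0, 2.0], dtype="float32")
print(layer(x).numpy())                                 # [-0.3 -0.1  0.   2. ] -- negatives scaled by alpha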

How do I use Leaky ReLU in Python?

The Leaky ReLU function is a refinement of the regular ReLU function. To address the problem of a zero gradient for negative values, Leaky ReLU gives a small linear component of x to negative inputs. Mathematically: f(x) = a * x for x < 0 and f(x) = x for x >= 0, where a is a small constant (e.g. 0.01).
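To make the formula concrete, here is a plain NumPy version of the function (an illustrative sketch, not the Keras layer itself):

import numpy as np

def leaky_relu(x, alpha=0.01):
    # f(x) = alpha * x for x < 0, f(x) = x for x >= 0
    return np.where(x >= 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [-0.02  -0.005  0.     3.   ]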

Why use LeakyReLU?

Leaky ReLU has two benefits: it fixes the “dying ReLU” problem, since it has no zero-slope parts, and it can speed up training, as there is evidence that keeping the mean activation close to 0 makes training faster.

How do you write leaky ReLU?

LeakyReLU allows a small gradient when the unit is not active (negative): f(x) = alpha * x for x < 0, and f(x) = x for x >= 0.


1 Answer

All advanced activations in Keras, including LeakyReLU, are available as layers, not as activation functions; therefore, you should use it as a layer:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
# use
cnn_model.add(LeakyReLU(alpha=0.1))
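Applied to the model in the question, the first convolution block would then look like this (a sketch of the pattern; the remaining blocks follow the same substitution):

cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))          # replaces Activation('relu')
cnn_model.add(MaxPooling2D((2, 2), padding='same'))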
desertnaut answered Oct 14 '22