Keras - using activation function with a parameter

Tags:

keras

How is it possible to use leaky ReLUs in the newest version of Keras? The function relu() accepts an optional parameter alpha that controls the negative slope, but I cannot figure out how to pass this parameter when constructing a layer.

This is how I tried to do it:

model.add(Activation(relu(alpha=0.1)))

but then I get the error

TypeError: relu() missing 1 required positional argument: 'x'

How can I use a leaky ReLU, or any other activation function with some parameter?

Asked Jun 02 '17 by Lugi

People also ask

What is activation='relu' in Keras?

The Rectified Linear Unit (ReLU) is the most commonly used activation function in deep learning models. It returns 0 for any negative input; for any positive value x it returns x unchanged.
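For reference, here is a minimal NumPy sketch of plain ReLU and of the leaky variant the question is about (the function names here are illustrative, not the Keras implementation): ReLU clamps negative inputs to 0, while leaky ReLU scales them by a small slope alpha instead.

import numpy as np

def relu(x):
    # Standard ReLU: elementwise max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.1):
    # Leaky ReLU: keep positive values, scale negatives by alpha
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))             # [0.  0.  0.  1.5]
print(leaky_relu(x, 0.1))  # [-0.2  -0.05  0.    1.5]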


1 Answer

relu is a function, not a class, and it takes the input to the activation as its parameter x. The Activation layer takes a function as its argument, so you can wrap relu in a lambda that fixes alpha, for example:

model.add(Activation(lambda x: relu(x, alpha=0.1)))
Answered Oct 15 '22 by Thomas Jungblut
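For a fuller picture, here is a self-contained version of the accepted approach. This is a sketch assuming the TensorFlow-bundled Keras (at the time of the question, plain "from keras ..." imports worked the same way), and the layer sizes are made up for illustration:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.activations import relu

model = Sequential()
model.add(Dense(64, input_shape=(10,)))
# Wrap relu in a lambda so alpha is fixed when the layer is built:
model.add(Activation(lambda x: relu(x, alpha=0.1)))

Note that Keras also ships a dedicated LeakyReLU layer (keras.layers.LeakyReLU(alpha=0.1)), which achieves the same thing without a lambda; lambda-based activations can make model serialization trickier.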