How is it possible to use leaky ReLUs in the newest version of Keras? The function relu() accepts an optional parameter 'alpha', which controls the negative slope, but I cannot figure out how to pass this parameter when constructing a layer.
This is how I tried to do it:
model.add(Activation(relu(alpha=0.1)))
but then I get the error
TypeError: relu() missing 1 required positional argument: 'x'
How can I use a leaky ReLU, or any other activation function with some parameter?
The Rectified Linear Unit (ReLU) is the most commonly used activation function in deep learning models. The function returns 0 if it receives any negative input, but for any positive value x it returns that value unchanged; a leaky ReLU instead returns alpha * x for negative inputs, where alpha is the negative slope.
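For illustration only, here is a minimal plain-Python sketch of the difference between a standard and a leaky ReLU (the helper name leaky_relu_demo is hypothetical, not part of Keras):

def leaky_relu_demo(x, alpha=0.0):
    # alpha is the slope applied to negative inputs; alpha=0.0 gives the standard ReLU
    return x if x > 0 else alpha * x

print(leaky_relu_demo(-2.0))             # 0.0  -> standard ReLU zeroes out negatives
print(leaky_relu_demo(-2.0, alpha=0.1))  # -0.2 -> leaky ReLU keeps a small negative slope
print(leaky_relu_demo(3.0, alpha=0.1))   # 3.0  -> positive values pass through unchanged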
relu is a function, not a class, and it takes the input to the activation as its parameter x. The Activation layer expects a function as its argument, so you can wrap relu in a lambda that binds alpha and passes the input x through, for example:
model.add(Activation(lambda x: relu(x, alpha=0.1)))
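For completeness, here is a minimal self-contained sketch of the lambda approach (assuming the TensorFlow-bundled Keras API; with older standalone Keras the imports would come from keras.* instead, and the layer sizes here are arbitrary placeholders):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.activations import relu

model = Sequential()
model.add(Dense(64, input_shape=(100,)))
# Wrap relu in a lambda so the alpha (negative slope) argument is already bound
# by the time Keras calls the activation with the layer output x
model.add(Activation(lambda x: relu(x, alpha=0.1)))

Keras also ships a dedicated LeakyReLU layer (keras.layers.LeakyReLU(alpha=0.1)), which gives the same behavior without a lambda and serializes more cleanly when the model is saved.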