Using TensorFlow 1.5, I am trying to add a leaky_relu activation to the output of a dense layer, while still being able to change the alpha of leaky_relu (check here). I know I can do it as follows:
output = tf.layers.dense(input, n_units)
output = tf.nn.leaky_relu(output, alpha=0.01)
I was wondering if there is a way to write this in one line, as we can do for relu:
output = tf.layers.dense(input, n_units, activation=tf.nn.relu)
I tried the following but I get an error:
output = tf.layers.dense(input, n_units, activation=tf.nn.leaky_relu(alpha=0.01))
TypeError: leaky_relu() missing 1 required positional argument: 'features'
Is there a way to do this?
If you're really adamant about a one-liner for this, you could use partial() from the functools module, as follows:
import tensorflow as tf
from functools import partial
output = tf.layers.dense(input, n_units, activation=partial(tf.nn.leaky_relu, alpha=0.01))
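This works because partial(tf.nn.leaky_relu, alpha=0.01) returns a callable that still expects the features tensor, which is exactly what the activation argument wants. As a minimal runnable sketch under TF 1.x (the placeholder shape and unit count are illustrative assumptions, not from the question):

import tensorflow as tf
from functools import partial

# Illustrative shapes: a batch of 10-dimensional inputs, 5 output units.
inputs = tf.placeholder(tf.float32, shape=[None, 10])
output = tf.layers.dense(inputs, 5,
                         activation=partial(tf.nn.leaky_relu, alpha=0.01))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(output, feed_dict={inputs: [[1.0] * 10]})
    print(result.shape)  # (1, 5)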
Note that partial() does not work for all operations, and you may have to try your luck with partialmethod() from the same module.
Hope this helps you in your endeavour.
At least as of TensorFlow version 2.3.0.dev20200515, a LeakyReLU activation with an arbitrary alpha parameter can be passed as the activation parameter of a Dense layer:
output = tf.keras.layers.Dense(n_units, activation=tf.keras.layers.LeakyReLU(alpha=0.01))(x)
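This works because a LeakyReLU layer instance is itself a callable that maps a tensor to a tensor, which is all the activation argument requires. A minimal sketch of a full model (the input shape and unit counts here are illustrative assumptions):

import tensorflow as tf

# Illustrative shapes: 16-dimensional inputs, one hidden layer, scalar output.
inputs = tf.keras.Input(shape=(16,))
hidden = tf.keras.layers.Dense(
    32, activation=tf.keras.layers.LeakyReLU(alpha=0.01))(inputs)
outputs = tf.keras.layers.Dense(1)(hidden)
model = tf.keras.Model(inputs, outputs)
model.summary()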
LeakyReLU activation works as:
f(x) = x for x >= 0
f(x) = alpha * x for x < 0
(equivalently, f(x) = max(alpha * x, x) for 0 < alpha < 1)
[LeakyReLU graph: the identity for positive inputs, a line with slope alpha for negative inputs]
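A quick numeric check of this piecewise definition, run eagerly under TF 2.x (the sample values are arbitrary):

import tensorflow as tf

# Negative inputs are scaled by alpha = 0.01; non-negative inputs pass through.
x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])
print(tf.nn.leaky_relu(x, alpha=0.01).numpy())
# [-0.02  -0.005  0.  1.  3.]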
More information: Wikipedia - Rectifier (neural networks)