The Parametric Rectified Linear Unit (PReLU) is an interesting and widely used activation function. It seems that TensorFlow does not provide PReLU out of the box, while higher-level libraries such as Keras and TFLearn do have implementations of it.
How can I implement PReLU in TensorFlow?
A Parametric Rectified Linear Unit, or PReLU, is an activation function that generalizes the traditional rectified unit with a slope for negative values. Formally:

f(y_i) = y_i          if y_i > 0
f(y_i) = a_i * y_i    if y_i ≤ 0
Keras exposes this as the PReLU layer class (Parametric Rectified Linear Unit).
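For reference, that higher-level route is a one-liner; a minimal sketch (the layer sizes here are arbitrary):

import tensorflow as tf

# Minimal sketch of the Keras route: tf.keras.layers.PReLU learns one alpha
# per feature, with the alphas initialized to zero by default.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(32,)),
    tf.keras.layers.PReLU(),   # learnable negative slope
    tf.keras.layers.Dense(10),
])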
In the TensorFlow Python API, the default value for the activation kwarg of tf.layers.dense is None, and the documentation says: "activation: Activation function to use. If you don't specify anything, no activation is applied (i.e. 'linear' activation: a(x) = x)."
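In other words, the activation kwarg accepts any callable, or None for a purely linear layer. A short sketch of both cases (TF 1.x graph API, shapes chosen only for illustration):

# activation=None leaves the dense layer linear; any callable can be passed instead.
x = tf.placeholder(tf.float32, shape=[None, 32])
linear_out = tf.layers.dense(x, units=64, activation=None)      # no activation applied
relu_out = tf.layers.dense(x, units=64, activation=tf.nn.relu)  # built-in activation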
The implementation of PReLU seems straightforward, based on the PReLU implementations in the higher-level libraries (see: Keras, TFLearn and TensorLayer). My code is as follows:
import tensorflow as tf

def parametric_relu(_x):
    # One learnable alpha per channel (last dimension), initialized to zero
    # so the function behaves like a plain ReLU at the start of training.
    alphas = tf.get_variable('alpha', _x.get_shape()[-1],
                             initializer=tf.constant_initializer(0.0),
                             dtype=tf.float32)
    pos = tf.nn.relu(_x)                      # max(0, x)
    neg = alphas * (_x - tf.abs(_x)) * 0.5    # (x - |x|) / 2 == min(0, x)
    return pos + neg
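One caveat: because the function calls tf.get_variable('alpha', ...), each use in the same graph needs its own variable scope so the 'alpha' variables do not collide. A usage sketch (the scope names fc1/fc2 and the shapes are just examples):

# Wrap each call in a distinct variable scope to keep the alphas separate.
x = tf.placeholder(tf.float32, shape=[None, 128])

with tf.variable_scope('fc1'):
    h = parametric_relu(tf.layers.dense(x, units=64))   # creates variable fc1/alpha
with tf.variable_scope('fc2'):
    y = parametric_relu(tf.layers.dense(h, units=10))   # creates variable fc2/alpha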