 

How to implement PReLU activation in Tensorflow?

Tags:

tensorflow

The Parametric Rectified Linear Unit (PReLU) is an interesting and widely used activation function. It seems that TensorFlow (reference link) does not provide PReLU out of the box. I know that higher-level libraries, such as Keras and TFLearn, have implementations of it.

I would like to know how to implement PReLU in TensorFlow.

Hasnat asked Oct 11 '16

People also ask

What is the PReLU activation function?

A Parametric Rectified Linear Unit, or PReLU, is an activation function that generalizes the traditional rectified unit with a slope for negative values. Formally: f(y_i) = y_i if y_i ≥ 0, and f(y_i) = a_i * y_i if y_i < 0.
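As a quick sanity check of the formula, here is a tiny NumPy sketch (the helper name and example values are made up for illustration):

import numpy as np

def prelu(y, a):
    # y_i if y_i >= 0, otherwise a_i * y_i
    return np.where(y >= 0, y, a * y)

print(prelu(np.array([-2.0, -0.5, 0.0, 3.0]), a=0.25))
# negative inputs are scaled by a, non-negative inputs pass through: [-0.5, -0.125, 0.0, 3.0]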

What is PReLU in Keras?

The PReLU class implements the Parametric Rectified Linear Unit.
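For completeness, a minimal sketch of using the Keras layer (assuming a recent tf.keras; in standalone Keras the layer lives under keras.layers, and the layer sizes here are arbitrary):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(100,)),
    tf.keras.layers.PReLU(),   # one learnable slope per unit by default
    tf.keras.layers.Dense(10),
])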

What is the default activation in TensorFlow?

In the TensorFlow Python API, the default value for the activation kwarg of tf.layers.dense is None, and the documentation says: activation: Activation function to use. If you don't specify anything, no activation is applied (i.e. a "linear" activation: a(x) = x).
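To make that concrete, a small sketch in TF 1.x style (matching the tf.layers API quoted above; the placeholder shape and layer sizes are made up):

import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=[None, 128])

linear_out = tf.layers.dense(inputs, 64)                         # activation=None: purely linear output
relu_out   = tf.layers.dense(inputs, 64, activation=tf.nn.relu)  # explicit nonlinearity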


1 Answer

The implementation of PReLU seems straightforward based on the PReLU implementations in the higher-level libraries (see: Keras, TFLearn and TensorLayer). My code is as follows:

import tensorflow as tf

def parametric_relu(_x):
  # One trainable slope per channel (last dimension), initialized to 0.0,
  # so the unit initially behaves like a plain ReLU.
  alphas = tf.get_variable('alpha', _x.get_shape()[-1],
                           initializer=tf.constant_initializer(0.0),
                           dtype=tf.float32)
  pos = tf.nn.relu(_x)                    # max(x, 0)
  neg = alphas * (_x - tf.abs(_x)) * 0.5  # alphas * min(x, 0)

  return pos + neg
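One caveat worth noting (my own addition, not part of the original answer): tf.get_variable('alpha', ...) will raise an error if the function is called twice in the same scope, so each use typically goes inside its own variable scope. A sketch, assuming x is some input tensor:

# Wrap each call in a distinct variable scope so 'alpha' gets a unique name.
with tf.variable_scope('layer1'):
    h1 = parametric_relu(tf.layers.dense(x, 64))
with tf.variable_scope('layer2'):
    h2 = parametric_relu(tf.layers.dense(h1, 64))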
Hasnat answered Jan 01 '23