Using leaky ReLU in TensorFlow

How can I change G_h1 = tf.nn.relu(tf.matmul(z, G_W1) + G_b1) to leaky ReLU? I have tried looping over the tensor using max(value, 0.01*value), but I get TypeError: Using a tf.Tensor as a Python bool is not allowed.

I also tried to find the source code for relu on the TensorFlow GitHub so that I could modify it into a leaky ReLU, but I couldn't find it.

asked Jul 25 '17 by HYUN JAE CHO


People also ask

Can we use leaky ReLU in CNN?

Yes. LeakyReLU keeps the advantages of ReLU while using a small fixed negative slope to cope with the dying-gradient problem; the PReLU variant instead learns that slope as a trainable parameter within the CNN. Experimental results show the method is effective and feasible.

How do you implement a leaky ReLU in Python?

The Leaky ReLU function is a modification of the regular ReLU function. To address the problem of zero gradient for negative values, Leaky ReLU gives an extremely small linear component of x to negative inputs. Mathematically: f(x) = 0.01x for x < 0, and f(x) = x for x >= 0.
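
A minimal NumPy sketch of that definition (the function name and the alpha default are choices made here, not part of the original answer):

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Keep x where it is non-negative; scale it by alpha elsewhere.
    return np.where(x >= 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 3.0])))  # -2.0 maps to -0.02, 3.0 stays 3.0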

What is leaky ReLU in keras?

A leaky ReLU is a rectified linear unit (ReLU) with a small positive slope in the negative part of its input space. The motivation for leaky ReLUs is that vanilla ReLUs have a gradient of zero in the negative part of their input space, which can harm learning. This corresponds to the Keras LeakyReLU layer.
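
A minimal Keras sketch: the layer is added after a linear layer rather than passed as an activation string (alpha=0.3 is the Keras default; the layer sizes here are placeholders):

from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(64, input_shape=(10,)),  # placeholder layer and input shape
    keras.layers.LeakyReLU(alpha=0.3),          # negative inputs are scaled by 0.3
])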


2 Answers

You could write one based on tf.nn.relu, something like:

def lrelu(x, alpha):
  # For x >= 0: relu(x) = x and relu(-x) = 0, so the result is x.
  # For x < 0:  relu(x) = 0 and relu(-x) = -x, so the result is alpha * x.
  return tf.nn.relu(x) - alpha * tf.nn.relu(-x)
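
Applied to the line from the question, usage might look like this (alpha=0.2 is an arbitrary choice; z, G_W1 and G_b1 are the question's variables):

G_h1 = lrelu(tf.matmul(z, G_W1) + G_b1, alpha=0.2)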

EDIT

TensorFlow 1.4 now has a native tf.nn.leaky_relu.
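
With the native op, the question's line becomes a one-liner (alpha defaults to 0.2 if omitted):

G_h1 = tf.nn.leaky_relu(tf.matmul(z, G_W1) + G_b1, alpha=0.2)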

answered Sep 17 '22 by P-Gn


If alpha < 1 (and it should be), you can use tf.maximum(x, alpha * x).
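
This works because for x >= 0 we have x >= alpha * x, while for x < 0 we have alpha * x > x, so the elementwise maximum picks the correct branch in both cases. Applied to the question's line (the intermediate name h is introduced here purely for readability):

h = tf.matmul(z, G_W1) + G_b1
G_h1 = tf.maximum(h, 0.2 * h)  # leaky ReLU with alpha = 0.2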

answered Sep 21 '22 by Manolo Santos