 

Keras: How to use max_value in the ReLU activation function

The relu function, as defined in keras/activations.py, is:

    def relu(x, alpha=0., max_value=None):
        return K.relu(x, alpha=alpha, max_value=max_value)

It has a max_value argument which can be used to clip the output. Now how can this be used/called in the code? I have tried the following: (a)

    model.add(Dense(512, input_dim=1))
    model.add(Activation('relu', max_value=250))

which raises:

    assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
    AssertionError: Keyword argument not understood: max_value

(b)

    Rel = Activation('relu',max_value=250)

This raises the same error.

(c)

    from keras.layers import activations
    uu = activations.relu(??,max_value=250)

The problem with this is that it expects the input tensor as the first argument. The error is 'relu() takes at least 1 argument (1 given)'.

So how do I make this a layer?

    model.add(activations.relu(max_value=250))

has the same issue: 'relu() takes at least 1 argument (1 given)'.

If this function cannot be used as a layer, then there seems to be no way of specifying a clip value for ReLU. This implies that the comment here https://github.com/fchollet/keras/issues/2119 , closing a proposed change, is wrong... Any thoughts? Thanks!

asked Dec 20 '16 by krat

People also ask

What is activation='relu' in Keras?

The relu function applies the rectified linear unit activation function. With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.
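
A minimal sketch of that behaviour using the backend (assuming Keras 2 with a TensorFlow backend; the example values are arbitrary):

from keras import backend as K
from keras.activations import relu

x = K.constant([-2.0, -0.5, 0.0, 1.5, 300.0])  # arbitrary example values
print(K.eval(relu(x)))                  # default: element-wise max(x, 0)
print(K.eval(relu(x, max_value=250)))   # additionally clipped at 250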

What is clipped ReLU?

A clipped ReLU layer performs a threshold operation, where any input value less than zero is set to zero and any value above the clipping ceiling is set to that clipping ceiling. This operation is equivalent to:

    f(x) = 0        for x < 0
           x        for 0 <= x < ceiling
           ceiling  for x >= ceiling
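
A minimal NumPy sketch of that piecewise definition (the ceiling of 6.0 is just an arbitrary example):

import numpy as np

def clipped_relu(x, ceiling=6.0):
    # 0 for x < 0, x for 0 <= x < ceiling, ceiling for x >= ceiling
    return np.minimum(np.maximum(x, 0.0), ceiling)

print(clipped_relu(np.array([-1.0, 2.5, 10.0])))  # -> [0.  2.5 6. ]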

How do you use leaky ReLU?

You can use the LeakyReLU layer, i.e. the Python class, instead of just specifying a string name as in your example. It works like any other layer. Being able to simply write e.g. activation='relu' is possible because of simple aliases that are created in the source code.
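
For illustration, a minimal sketch of the layer-style usage (assuming Keras 2, where LeakyReLU is importable from keras.layers; the alpha value is an arbitrary example):

from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

model = Sequential()
model.add(Dense(512, input_dim=1))  # no activation string here
model.add(LeakyReLU(alpha=0.1))     # the leaky ReLU is added as its own layer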


2 Answers

You can use the ReLU function of the Keras backend. Therefore, first import the backend:

from keras import backend as K

Then, you can pass your own function as activation using backend functionality. This would look like

def relu_advanced(x):
    return K.relu(x, max_value=250)

Then you can use it like

model.add(Dense(512, input_dim=1, activation=relu_advanced))

or

model.add(Activation(relu_advanced))

Unfortunately, you must hard-code the additional arguments this way. Therefore, it is better to use a function that returns your function and passes in your custom value:

def create_relu_advanced(max_value=1.):
    # factory: returns a ReLU activation clipped at max_value
    def relu_advanced(x):
        return K.relu(x, max_value=K.cast_to_floatx(max_value))
    return relu_advanced

Then you can pass your arguments by either

model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))

or

model.add(Activation(create_relu_advanced(max_value=250)))
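
Putting it together, a small end-to-end sketch using create_relu_advanced from above (the second layer, optimizer, and loss are arbitrary choices, not part of the original question):

from keras.models import Sequential
from keras.layers import Dense
from keras import backend as K

model = Sequential()
model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')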
answered Sep 26 '22 by Markus Eisenbach

That is as easy as one lambda:

from keras.activations import relu
clipped_relu = lambda x: relu(x, max_value=3.14)

Then use it like this:

model.add(Conv2D(64, (3, 3)))
model.add(Activation(clipped_relu))

When loading a model saved in HDF5, use the custom_objects dictionary:

model = load_model(model_file, custom_objects={'<lambda>': clipped_relu})
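
If you prefer to avoid the '<lambda>' key, a named function works the same way; the key in custom_objects then matches the function's name (a sketch, assuming a model was trained and saved with this activation):

from keras.models import load_model
from keras.activations import relu

def clipped_relu(x):
    return relu(x, max_value=3.14)

# model_file is the path to the saved .h5 file, as above
model = load_model(model_file, custom_objects={'clipped_relu': clipped_relu})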
answered Sep 24 '22 by hans