
How do you create a custom activation function with Keras?

Sometimes the standard activations like ReLU, tanh, and softmax, and the advanced activations like LeakyReLU, aren't enough. The function you need might also not be in keras-contrib.

How do you create your own activation function?

Asked May 11 '17 by Martin Thoma

People also ask

Can you make your own activation function?

While TensorFlow already contains a number of built-in activation functions, there are ways to create your own custom activation function or to edit an existing one. ReLU (Rectified Linear Unit) is still the most common activation function used in the hidden layers of any neural network architecture.

What is activation='relu' in Keras?

The Rectified Linear Unit is the most commonly used activation function in deep learning models. It returns 0 for any negative input, but for any positive value x it returns that value unchanged.
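
For illustration, a minimal sketch of that usage (the layer size and input dimension here are arbitrary choices, not taken from the answer):

from keras.layers import Dense

# relu(x) = max(0, x): negative inputs map to 0,
# positive inputs pass through unchanged
layer = Dense(64, activation='relu', input_dim=784)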


3 Answers

Credit goes to this GitHub issue comment by Ritchie Ng.

# Creating a model
from keras.models import Sequential
from keras.layers import Dense

# Custom activation function
from keras.layers import Activation
from keras import backend as K
from keras.utils.generic_utils import get_custom_objects


def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1

# Register the activation under a string name so it can be
# referenced like a built-in activation
get_custom_objects().update({'custom_activation': Activation(custom_activation)})

# Usage
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation(custom_activation, name='SpecialActivation'))
model.summary()  # summary() prints the table itself; no need to wrap it in print()

Please keep in mind that you have to import or re-register this function when you save and restore the model. See the note in keras-contrib.
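
For example, a minimal sketch of restoring such a model, assuming custom_activation is defined or imported in the loading script ('model.h5' is a placeholder path):

from keras.models import load_model

# custom_objects maps the name stored in the saved model
# back to the Python function that implements it
model = load_model('model.h5',
                   custom_objects={'custom_activation': custom_activation})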

Answered by Martin Thoma


Slightly simpler than Martin Thoma's answer: you can just create a custom element-wise backend function and pass it as the activation parameter. You still need to import this function before loading a saved model.

from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def custom_activation(x):
    # scaled, shifted sigmoid with output range (-1, 4)
    return (K.sigmoid(x) * 5) - 1

model = Sequential()
model.add(Dense(32, input_dim=784, activation=custom_activation))
Answered by Eponymous


Let's say you would like to add swish or gelu to Keras. The previous methods are nice inline insertions, but you could also add your function to Keras's set of built-in activations, so that you can call your custom function by name just as you would call ReLU. I tested this with Keras 2.2.2 (any 2.x version should do). Append the definition of your custom function to the file $HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py (the exact path depends on your Python and Anaconda versions).

In Keras internals, append to:

$HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py

def swish(x, alpha=1.0, beta=1.0):
    # alpha and beta were left undefined in the original snippet;
    # defaults of 1.0 give the standard swish: x * sigmoid(x)
    return alpha * x * K.sigmoid(beta * x)

Then in your Python file, for example $HOME/Documents/neural_nets.py:

from keras.models import Sequential
from keras.layers import Activation

model = Sequential()
model.add(Activation('swish'))
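
If you would rather not edit files inside site-packages, the registration approach from the first answer gives you the same string lookup without patching Keras itself. A minimal sketch, assuming the swish definition above:

from keras import backend as K
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

def swish(x, alpha=1.0, beta=1.0):
    # same swish as above, registered rather than patched in
    return alpha * x * K.sigmoid(beta * x)

get_custom_objects().update({'swish': Activation(swish)})

After this, model.add(Activation('swish')) works exactly as in the snippet above, and the registration survives package upgrades.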
Answered by Julien Nyambal