Sometimes the standard activations like ReLU, tanh, and softmax, and the advanced activations like LeakyReLU, aren't enough, and the one you need might not be in keras-contrib either.
How do you create your own activation function?
While TensorFlow already ships with a number of built-in activation functions, there are ways to create your own custom activation function or to modify an existing one. ReLU (Rectified Linear Unit) is still the most commonly used activation function in the hidden layers of any neural network architecture: it returns 0 for any negative input, and for any positive value x it returns that value unchanged.
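For reference, here is a minimal sketch of that same rule written with the Keras backend (my_relu is just an illustrative name; in practice you would use the built-in keras.activations.relu):
from keras import backend as K
def my_relu(x):
    # Element-wise max(0, x): negatives become 0, positives pass through unchanged
    return K.maximum(0., x)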
Credits to this GitHub issue comment by Ritchie Ng.
# Creating a model
from keras.models import Sequential
from keras.layers import Dense
# Custom activation function
from keras.layers import Activation
from keras import backend as K
from keras.utils.generic_utils import get_custom_objects
def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1
get_custom_objects().update({'custom_activation': Activation(custom_activation)})
# Usage
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation(custom_activation, name='SpecialActivation'))
print(model.summary())
Please keep in mind that you have to import (or re-register) this function when you save and restore the model; see the note in keras-contrib.
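As a sketch of what that looks like (the file name here is only an example), the function can be passed via custom_objects when loading:
from keras.models import load_model
model.save('model_with_custom_activation.h5')  # example file name
# custom_activation must be defined/imported in the session that loads the model,
# otherwise Keras cannot deserialize the activation by its name.
restored_model = load_model(
    'model_with_custom_activation.h5',
    custom_objects={'custom_activation': custom_activation}
)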
Slightly simpler than Martin Thoma's answer: you can just create a custom element-wise backend function and pass it as the activation parameter. You still need to import this function before loading your model.
from keras import backend as K
def custom_activation(x):
    return (K.sigmoid(x) * 5) - 1
model.add(Dense(32, activation=custom_activation))
Let's say you would like to add swish or gelu to Keras. The previous methods are nice inline insertions, but you could also add them to the set of Keras activation functions, so that you can call your custom function by name just as you would call ReLU. I tested this with Keras 2.2.2 (any v2 should do). Append the definition of your custom function to the file $HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py (the exact path can differ depending on your Python and Anaconda versions).
In the Keras internals:
$HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py
def swish(x, beta=1.0, alpha=1.0):
    # x * sigmoid(beta * x), scaled by alpha; the defaults give the standard swish
    return alpha * x * K.sigmoid(beta * x)
Then in your Python file:
$HOME/Documents/neural_nets.py
model = Sequential()
model.add(Activation('swish'))
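If you also want gelu, here is a sketch of a definition you could append in the same way (this uses the common tanh approximation of GELU):
import math
from keras import backend as K
def gelu(x):
    # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + K.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * K.pow(x, 3))))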