
How to implement RBF activation function in Keras?

I am creating a customized activation function, RBF activation function in particular:

from keras import backend as K
from keras.layers import Lambda

l2_norm = lambda a,b:  K.sqrt(K.sum(K.pow((a-b),2), axis=0, keepdims=True))

def rbf2(x):
    X = # here I need the inputs received from the previous layer
    Y = # here I need the weights that should be applied for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))
    return res

The function rbf2 receives the output of the previous layer as input:

#some keras layers
model.add(Dense(84, activation='tanh')) #layer1
model.add(Dense(10, activation = rbf2)) #layer2

What should I do to get the inputs from layer1 and weights from layer2 to create the customized activation function?

What I am actually trying to do is implement the output layer of the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector.

For example, layer1 has 84 neurons and layer2 has 10 neurons. In the general case, to calculate the output of each of the 10 neurons in layer2, we take the dot product of the 84 outputs of layer1 with the 84 weights connecting layer1 to that neuron, and then apply a softmax activation function over the results.

But here, instead of taking a dot product, each neuron of layer2 outputs the square of the Euclidean distance between its input vector and its weight vector (I want to use this as my activation function).

Any help on creating an RBF activation function (computing the Euclidean distance between the inputs a layer receives and its weights) and using it in a layer would also be helpful.
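
For reference, here is a minimal NumPy sketch of what I mean; the 84 → 10 shapes match my example, while x and W are just placeholder names:

import numpy as np

x = np.random.randn(84)      # output of layer1 for a single sample
W = np.random.randn(84, 10)  # weights connecting layer1 to the 10 neurons of layer2

# usual dense layer: one dot product per neuron -> shape (10,)
dot_out = x @ W

# LeNet-5 style: squared Euclidean distance to each weight column -> shape (10,)
dist_sq = ((x[:, None] - W) ** 2).sum(axis=0)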

asked Dec 19 '18 by Vamshi Pulluri



2 Answers

You can simply define a custom layer for this purpose:

from keras.layers import Layer
from keras import backend as K

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units
        self.gamma = K.cast_to_floatx(gamma)

    def build(self, input_shape):
        # one center vector of the same dimensionality as the input, per RBF unit
        self.mu = self.add_weight(name='mu',
                                  shape=(int(input_shape[1]), self.units),
                                  initializer='uniform',
                                  trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # inputs: (batch, input_dim) -> (batch, input_dim, 1), broadcast against mu: (input_dim, units)
        diff = K.expand_dims(inputs) - self.mu
        # squared Euclidean distance to each center, then the Gaussian response
        l2 = K.sum(K.pow(diff, 2), axis=1)
        res = K.exp(-1 * self.gamma * l2)
        return res

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

Example usage:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(20, input_shape=(100,)))
model.add(RBFLayer(10, 0.5))
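
To wire this into the 84 → 10 setup from the question, one possible sketch would be the following (the input_shape, the gamma value and the trailing softmax are just assumptions, not something the question fixes):

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(84, activation='tanh', input_shape=(120,)))  # layer1; input_shape=(120,) is only an example
model.add(RBFLayer(10, gamma=0.5))                           # layer2: RBF responses instead of dot products
model.add(Activation('softmax'))                             # optional: normalize the 10 responses into class probabilities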
answered Oct 06 '22 by today


There is no need to reinvent the wheel here. A custom RBF layer for Keras already exists.

answered Oct 06 '22 by Hagbard