Share weights between two dense layers in keras

I have the following code. What I want to do is share the same weights between two dense layers.

The equations for the op1 and op2 layers would look like this:

op1 = w1y1 + w2y2 + w3y3 + w4y4 + w5y5 + b1

op2 = w1z1 + w2z2 + w3z3 + w4z4 + w5z5 + b1

Here the weights w1 to w5 are shared between the op1 and op2 layers, whose inputs are (y1 to y5) and (z1 to z5) respectively.

from keras.layers import Input, Dense, concatenate
from keras.models import Model

ip_shape1 = Input(shape=(5,))
ip_shape2 = Input(shape=(5,))

op1 = Dense(1, activation = "sigmoid", kernel_initializer = "ones")(ip_shape1)
op2 = Dense(1, activation = "sigmoid", kernel_initializer = "ones")(ip_shape2)

merge_layer = concatenate([op1, op2])
predictions = Dense(1, activation='sigmoid')(merge_layer)

model = Model(inputs=[ip_shape1, ip_shape2], outputs=predictions)

Thanks in advance.

Asked Apr 17 '18 by Mahek Shah

People also ask

How do I share weights in Tensorflow?

TensorFlow 1.x has a concept called variable scope, which lets you create and share weights within that scope. You just need to use the same scope name for the different layers, and pass reuse=True to share and reuse the same weights. If you don't set reuse=True, you will get a duplicate-variable error.

What is dense () Keras?

Dense implements the operation output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is True).
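To make this concrete, here is a minimal NumPy sketch (no Keras needed) of what Dense(1, activation="sigmoid", kernel_initializer="ones") computes on a 5-dimensional input; the input values are made up for illustration:

```python
import numpy as np

def sigmoid(v):
    # Element-wise logistic sigmoid, the "activation" in the formula above
    return 1.0 / (1.0 + np.exp(-v))

# One sample with 5 features, shape (1, 5)
x = np.array([[1.0, 2.0, 3.0, 4.0, 5.0]])

# Dense(1) with kernel_initializer="ones": kernel shape (5, 1), bias starts at 0
kernel = np.ones((5, 1))
bias = np.zeros((1,))

# output = activation(dot(input, kernel) + bias), shape (1, 1)
output = sigmoid(x @ kernel + bias)
print(output)  # sigmoid(1+2+3+4+5) = sigmoid(15)
```

This matches the op1 equation in the question: with all weights equal to one, the layer sums its five inputs, adds the bias, and applies the sigmoid.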

What is densely connected nn?

In any neural network, a dense layer is a layer that is fully connected to its preceding layer, meaning each of its neurons is connected to every neuron of the preceding layer. It is the most commonly used layer in artificial neural networks.


1 Answer

This uses the same layer instance for both sides, so the weights and bias are shared:

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

ip_shape1 = Input(shape=(5,))
ip_shape2 = Input(shape=(5,))

dense = Dense(1, activation = "sigmoid", kernel_initializer = "ones")

op1 = dense(ip_shape1)
op2 = dense(ip_shape2)

merge_layer = Concatenate()([op1, op2])
predictions = Dense(1, activation='sigmoid')(merge_layer)

model = Model(inputs=[ip_shape1, ip_shape2], outputs=predictions)
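The key point is that a single Dense instance owns one (kernel, bias) pair, and every call applies that same pair. Here is a small NumPy sketch of that idea (the function shared_dense stands in for the single dense layer above; the random kernel is just for illustration):

```python
import numpy as np

# One shared (kernel, bias) pair, as in the single `dense` instance above
rng = np.random.default_rng(0)
kernel = rng.normal(size=(5, 1))  # shared w1..w5
bias = np.zeros((1,))             # shared b1

def shared_dense(x):
    # Same weights applied to whatever input is passed in,
    # mirroring op1 = dense(ip_shape1) and op2 = dense(ip_shape2)
    return 1.0 / (1.0 + np.exp(-(x @ kernel + bias)))

y = np.ones((1, 5))
z = np.ones((1, 5))
op1, op2 = shared_dense(y), shared_dense(z)
# Identical inputs through the shared layer give identical outputs
print(np.allclose(op1, op2))
```

In Keras terms: because op1 and op2 come from the same layer object, training updates a single set of parameters, which is exactly the weight sharing the question asks for. Creating two separate Dense(...) layers (as in the question's code) would instead give each output its own independent weights.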
Answered Sep 17 '22 by Daniel Möller