How to implement a neural network with a not-fully-connected layer as the final layer?

I would like to implement a neural network with an input layer, two dense hidden layers, and a non-dense output layer. A toy example is shown in the figure below. The first hidden layer has three neurons, the second has two, and the final layer has four; but between the second and final layers there are only four connections instead of the eight a fully-connected layer would have.

[Figure: Network architecture]

I would like to use Keras functional API. How can I implement it? Should I set the missing weight manually to 0? I would start as follows:

input=keras.layers.Input(...)
hidden1=keras.layers.Dense(3, activation="..")(input)
hidden2=keras.layers.Dense(2, activation="..")(hidden1)

but then I do not know how to proceed.

Asked Mar 02 '23 by Anne

1 Answer

The final layer is actually two separate Dense layers, each with 2 neurons and connected to a different neuron of the previous layer. Therefore, you can simply split the neurons of the second-to-last layer and pass each one to a different Dense layer:

from tensorflow import keras

input = keras.layers.Input(shape=(3,))
hidden1 = keras.layers.Dense(3)(input)
hidden2 = keras.layers.Dense(2)(hidden1)
hidden2_n1 = keras.layers.Lambda(lambda x: x[:,0:1])(hidden2)  # take the first neuron
hidden2_n2 = keras.layers.Lambda(lambda x: x[:,1:])(hidden2)   # take the second neuron
output1 = keras.layers.Dense(2)(hidden2_n1)
output2 = keras.layers.Dense(2)(hidden2_n2)
output = keras.layers.concatenate([output1, output2])  # optional: concatenate the layers to have a single output layer

model = keras.models.Model(input, output)

In tf.keras, or newer versions of Keras, you can index the tensor directly instead of using Lambda layers:

output1 = keras.layers.Dense(2)(hidden2[:,0:1])
output2 = keras.layers.Dense(2)(hidden2[:,1:])
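As for setting the missing weights manually to 0: the split above is equivalent to a single Dense layer whose 2×4 weight matrix is constrained to be block-diagonal. A minimal NumPy sketch (with arbitrary illustrative weights) shows the two views give the same result:

```python
import numpy as np

# Two separate 1->2 dense layers (weights chosen arbitrarily for illustration)
W1 = np.array([[1.0, 2.0]])   # maps neuron 1 of hidden2 to outputs 1-2
W2 = np.array([[3.0, 4.0]])   # maps neuron 2 of hidden2 to outputs 3-4

h = np.array([[0.5, -1.0]])   # a batch of one activation from hidden2

# Separate-layers view: slice, multiply, concatenate
separate = np.concatenate([h[:, 0:1] @ W1, h[:, 1:] @ W2], axis=1)

# Single-layer view: one 2x4 weight matrix with the "missing" weights fixed at 0
W_block = np.block([[W1, np.zeros((1, 2))],
                    [np.zeros((1, 2)), W2]])
combined = h @ W_block

assert np.allclose(separate, combined)
```

The two-separate-layers formulation is usually preferable in practice, because with a zero-initialized full matrix the optimizer would still update the "missing" weights during training unless you mask the gradients as well.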
Answered Mar 05 '23 by today