 

RNN: Understanding Concatenating Layers

I am trying to understand the concatenation of layers in tensorflow keras. Below I have drawn what I think the concatenation of two RNN layers and the output looks like (kept sparse for picture clarity).

My model

Here I am trying to concatenate two RNN layers. One layer takes longitudinal data (integer-valued) of patients over one time sequence; the other takes details of the same patients over another time sequence, with categorical input.

I don't want these two different time sequences to be mixed up, since it is medical data, so I am trying this approach. But first I want to be sure that what I have drawn is what concatenating two layers means. Below is my code. It appears to work well, but I want to confirm that what I drew and what is implemented are consistent.

# imports needed by the snippet below
import tensorflow as tf
from tensorflow.keras import layers, initializers
from tensorflow.keras.layers import Input

# create a SimpleRNN over the first input sequence
first_input = Input(shape=(4, 7), dtype='float32')
simpleRNN1 = layers.SimpleRNN(units=25,
                              bias_initializer=initializers.RandomNormal(stddev=0.0001),
                              activation="relu",
                              kernel_initializer="random_uniform")(first_input)

# another RNN layer for the second input sequence
second_input = Input(shape=(16, 1), dtype='float32')
simpleRNN2 = layers.SimpleRNN(units=25,
                              bias_initializer=initializers.RandomNormal(stddev=0.0001),
                              activation="relu",
                              kernel_initializer="random_uniform")(second_input)

# concatenate the two layers, stack dense layers on top
concat_lay = tf.keras.layers.Concatenate()([simpleRNN1, simpleRNN2])
dens_lay = layers.Dense(64, activation='relu')(concat_lay)
dens_lay = layers.Dense(32, activation='relu')(dens_lay)
dens_lay = layers.Dense(1, activation='sigmoid')(dens_lay)

model = tf.keras.Model(inputs=[first_input, second_input], outputs=[dens_lay])
# note: the learning rate belongs on the optimizer, not on compile()
model.compile(loss='binary_crossentropy',
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              metrics=["accuracy"])
model.summary()
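To see what the concatenation in this model actually produces: each SimpleRNN with units=25 emits its final hidden state of shape (batch, 25), and Concatenate joins the two along the feature axis into (batch, 50) — the two sequences are processed separately and only their summary vectors are joined. A minimal sketch (random weights, dummy zero inputs just to show the shapes):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Input

first_input = Input(shape=(4, 7), dtype="float32")
second_input = Input(shape=(16, 1), dtype="float32")
rnn1 = layers.SimpleRNN(units=25)(first_input)   # final hidden state -> (batch, 25)
rnn2 = layers.SimpleRNN(units=25)(second_input)  # final hidden state -> (batch, 25)
concat = layers.Concatenate()([rnn1, rnn2])      # joined features    -> (batch, 50)

probe = tf.keras.Model([first_input, second_input], concat)
out = probe.predict([np.zeros((2, 4, 7)), np.zeros((2, 16, 1))], verbose=0)
print(out.shape)  # (2, 50)
```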
asked Dec 29 '19 by Naveen Gabriel

People also ask

How many layers are in RNN?

There are three built-in RNN layers in Keras: keras.layers.SimpleRNN, keras.layers.GRU, and keras.layers.LSTM.

Can RNN have multiple hidden layers?

You can definitely have multiple hidden layers in an RNN. One of the most common approaches to determining the number of hidden units is to start with a very small network (one hidden unit), apply k-fold cross-validation (k over 30 will give very good accuracy), and estimate the average prediction risk.

How many hidden layers are there in RNN?

Multilayer RNNs generalize both feed-forward neural nets and one-hidden-layer RNNs. Deep learning has arguably achieved tremendous success in recent years. In simple words, deep learning uses the composition of many nonlinear functions to model the complex dependency between input features and labels.

What is RNN hidden layer?

Basically, an RNN layer is comprised of a single rolled RNN cell that unrolls according to the "number of steps" value (number of time steps/segments) you provide. As mentioned earlier, the main specialty of RNNs is the ability to model short-term dependencies. This is due to the hidden state in the RNN.
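On "multiple hidden layers": in Keras you stack recurrent layers by setting return_sequences=True on every recurrent layer except the last, so each layer passes the full sequence of hidden states to the next. A minimal sketch:

```python
import tensorflow as tf
from tensorflow.keras import layers

stacked = tf.keras.Sequential([
    layers.Input(shape=(4, 7)),
    layers.SimpleRNN(25, return_sequences=True),  # emit a hidden state at every step
    layers.SimpleRNN(25),                         # last recurrent layer: final state only
    layers.Dense(1, activation="sigmoid"),
])
print(stacked.output_shape)  # (None, 1)
```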


1 Answer

Concatenation means 'chaining together' or 'unification' here: making a union of two entities.
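In tensor terms, that "chaining together" is simply joining two feature vectors along the last axis:

```python
import tensorflow as tf

x = tf.constant([[1.0, 2.0, 3.0]])  # shape (1, 3)
y = tf.constant([[4.0, 5.0]])       # shape (1, 2)
z = tf.keras.layers.Concatenate()([x, y])  # shape (1, 5)
print(z.numpy())  # [[1. 2. 3. 4. 5.]]
```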

I think your problem is addressed in https://datascience.stackexchange.com/questions/29634/how-to-combine-categorical-and-continuous-input-features-for-neural-network-trai (How to combine categorical and continuous input features for neural network training).

If you have biomedical data, e.g. an ECG, as the continuous data and diagnoses as the categorical data, I would consider ensemble learning the best ansatz. Which approach is best here depends on the details of your problem.

Building an ensemble of two neural nets is described in https://machinelearningmastery.com/ensemble-methods-for-deep-learning-neural-networks/
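A simple form of such an ensemble is soft voting: train one model per data source and average their predicted probabilities. A minimal sketch — the function and model names are illustrative, not from the question:

```python
import numpy as np

def ensemble_predict(model_a, model_b, x_a, x_b):
    """Soft-voting sketch: average the predicted probabilities of two
    independently trained models (e.g. one per time sequence).
    Both models are assumed to expose a .predict(x) method that
    returns per-sample probabilities of the same shape."""
    p_a = np.asarray(model_a.predict(x_a))
    p_b = np.asarray(model_b.predict(x_b))
    return (p_a + p_b) / 2.0
```

A weighted average (weighting the stronger model more heavily) is a common refinement of this scheme.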

answered Oct 05 '22 by ralf htp