I want to build a neural network where the first two layers are feedforward and the last one is recurrent. Here is my code:
from keras.models import Sequential
from keras.layers import Dense, SimpleRNN
from keras import optimizers as OP

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(SimpleRNN(2, init='normal'))
adam = OP.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer="rmsprop")
and I get this error:
Exception: Input 0 is incompatible with layer simplernn_11: expected ndim=3, found ndim=2.
What is a Dense layer? A dense layer, also called a fully-connected layer, is a layer whose neurons are connected to every neuron of the preceding layer. It is the most commonly used layer in artificial neural networks.
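To make the "connected to every neuron" point concrete, here is a minimal sketch (the layer sizes 150 and 80 are arbitrary, and it uses the same Keras 1.x API as the question):

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(80, input_dim=150))
weights, biases = model.layers[0].get_weights()
print(weights.shape)  # (150, 80): one weight per input/output pair
print(biases.shape)   # (80,): one bias per unit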
It is correct that in Keras, a recurrent layer expects its input as (nb_samples, time_steps, input_dim), i.e. a 3D tensor, while a Dense layer outputs a 2D tensor of shape (nb_samples, output_dim); that is exactly the ndim mismatch reported in the error. However, you can still add an RNN layer after a Dense layer if you reshape the Dense output for the RNN layer. Reshape can be used both as the first layer and as an intermediate layer in a Sequential model. Examples are given below:
Reshape as first layer in a Sequential model
model = Sequential()
model.add(Reshape((3, 4), input_shape=(12,)))
# now: model.output_shape == (None, 3, 4)
# note: `None` is the batch dimension
Reshape as an intermediate layer in a Sequential model
model.add(Reshape((6, 2)))
# now: model.output_shape == (None, 6, 2)
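As a quick sanity check of what such a reshape does to a batch of data, here is a minimal numpy sketch (the batch size of 4 is arbitrary):

import numpy as np

batch = np.zeros((4, 80))            # Dense output: (nb_samples, 80), ndim=2
reshaped = batch.reshape((4, 1, 80)) # what Reshape((1, 80)) produces, ndim=3
print(reshaped.shape)                # (4, 1, 80): one time step of 80 features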
For example, if you change your code in the following way, there will be no error. I have checked this, and the model compiled without any error reported. You can change the dimensions as per your needs.
from keras.models import Sequential
from keras.layers import Dense, SimpleRNN, Reshape
from keras.optimizers import Adam

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(Reshape((1, 80)))  # (None, 80) -> (None, 1, 80): one time step of 80 features
model.add(SimpleRNN(2, init='normal'))
adam = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss='mean_squared_error', optimizer=adam)