import keras
import tensorflow as tf
from keras.layers import Input, Bidirectional, LSTM, Dropout, Dense
from keras.models import Model

Aux_input = Input(shape=(wrd_temp.shape[1], 1), dtype='float32')   # shape (None, 200, 1)
Main_input = Input(shape=(wrdvec.shape[1],), dtype='float32')      # shape (None, 367)

X = Bidirectional(LSTM(20, return_sequences=True))(Aux_input)
X = Dropout(0.2)(X)
X = Bidirectional(LSTM(28, return_sequences=True))(X)
X = Dropout(0.2)(X)
X = Bidirectional(LSTM(28, return_sequences=False))(X)
Aux_Output = Dense(Opt_train.shape[1], activation='softmax')(X)    # 22 classes in total

x = keras.layers.concatenate([Main_input, Aux_Output], axis=1)
x = tf.reshape(x, [1, 389, 1])  # 389 = Main_input (367) + Aux_Output (22)
x = Bidirectional(LSTM(20, return_sequences=True))(x)
x = Dropout(0.2)(x)
x = Bidirectional(LSTM(28, return_sequences=True))(x)
x = Dropout(0.2)(x)
x = Bidirectional(LSTM(28, return_sequences=False))(x)
Main_Output = Dense(Opt_train.shape[1], activation='softmax')(x)

model = Model(inputs=[Aux_input, Main_input], outputs=[Aux_Output, Main_Output])
The error occurs on the line that declares the model, i.e. model = Model(...), which raises an AttributeError. Also, if there is any other mistake in my implementation, please point it out in the comments.
The problem lies in the fact that every tf operation inside a Keras model should be encapsulated by either a keras.backend function, a Lambda layer, or a Keras layer/function with the same behavior. When you use a raw tf operation, you get a tf tensor object, which doesn't have the _keras_history field that Keras needs to trace the graph when Model(...) is built. When you use Keras functions, you get Keras tensors.
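
For reference, here is a minimal sketch of the corrected graph, assuming the dimensions given in the question's comments (wrd_temp.shape[1] == 200, wrdvec.shape[1] == 367, Opt_train.shape[1] == 22). The raw tf.reshape is replaced with Keras's built-in Reshape layer, so the tensor stays a Keras tensor and Model(...) can trace it:

import keras
from keras.layers import Input, Bidirectional, LSTM, Dropout, Dense, Reshape
from keras.models import Model

# Assumed dimensions, taken from the question's comments.
TIMESTEPS, MAIN_DIM, N_CLASSES = 200, 367, 22

Aux_input = Input(shape=(TIMESTEPS, 1), dtype='float32')
Main_input = Input(shape=(MAIN_DIM,), dtype='float32')

X = Bidirectional(LSTM(20, return_sequences=True))(Aux_input)
X = Dropout(0.2)(X)
X = Bidirectional(LSTM(28, return_sequences=False))(X)
Aux_Output = Dense(N_CLASSES, activation='softmax')(X)

x = keras.layers.concatenate([Main_input, Aux_Output], axis=1)
# Reshape is a Keras layer, so the result keeps its _keras_history;
# the batch dimension is handled implicitly (no hard-coded batch size of 1).
x = Reshape((MAIN_DIM + N_CLASSES, 1))(x)
x = Bidirectional(LSTM(20, return_sequences=False))(x)
Main_Output = Dense(N_CLASSES, activation='softmax')(x)

model = Model(inputs=[Aux_input, Main_input], outputs=[Aux_Output, Main_Output])
model.summary()

An equivalent Lambda-based fix would be x = Lambda(lambda t: K.expand_dims(t, axis=-1))(x) (with from keras import backend as K); either way the reshape is wrapped in a layer instead of being applied as a bare tf call.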