I am trying to create my first ensemble model in Keras. I have 3 input values and a single output value in my dataset.
from keras.optimizers import SGD,Adam
from keras.layers import Dense,Merge
from keras.models import Sequential
model1 = Sequential()
model1.add(Dense(3, input_dim=3, activation='relu'))
model1.add(Dense(2, activation='relu'))
model1.add(Dense(2, activation='tanh'))
model1.compile(loss='mse', optimizer='Adam', metrics=['accuracy'])
model2 = Sequential()
model2.add(Dense(3, input_dim=3, activation='linear'))
model2.add(Dense(4, activation='tanh'))
model2.add(Dense(3, activation='tanh'))
model2.compile(loss='mse', optimizer='SGD', metrics=['accuracy'])
model3 = Sequential()
model3.add(Merge([model1, model2], mode = 'concat'))
model3.add(Dense(1, activation='sigmoid'))
model3.compile(loss='binary_crossentropy', optimizer='Adam', metrics=['accuracy'])
model3.input_shape
The ensemble model (model3) compiles without any error, but when fitting it I have to pass the same input twice: model3.fit([X, X], y). I think this is an unnecessary step; instead of passing the input twice, I want a common input node for my ensemble model. How can I do that?
The Keras functional API seems to be a better fit for your use case: it lets you define a directed acyclic graph of layers, so you can build essentially arbitrary architectures and get more flexibility in the computation graph. For example:
import numpy as np
from keras.models import Model
from keras.layers import Input, Dense, concatenate
# a single input layer
inputs = Input(shape=(3,))
# model 1
x1 = Dense(3, activation='relu')(inputs)
x1 = Dense(2, activation='relu')(x1)
x1 = Dense(2, activation='tanh')(x1)
# model 2
x2 = Dense(3, activation='linear')(inputs)
x2 = Dense(4, activation='tanh')(x2)
x2 = Dense(3, activation='tanh')(x2)
# merging models
x3 = concatenate([x1, x2])
# output layer
predictions = Dense(1, activation='sigmoid')(x3)
# generate a model from the layers above
model = Model(inputs=inputs, outputs=predictions)
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
# Always a good idea to verify it looks as you expect it to
# model.summary()
data = np.array([[1, 2, 3], [1, 1, 3], [7, 8, 9], [5, 8, 10]])
labels = np.array([0, 0, 1, 1])
# The resulting model can be fit with a single input:
model.fit(data, labels, epochs=50)
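As a quick sanity check (a hypothetical follow-up, not part of the original answer), you can ask the trained model for predictions; since the output layer is a single sigmoid unit, you get one probability per sample:
preds = model.predict(data)   # shape (4, 1); exact values depend on the training run
print(preds)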
etov's answer is a great option. But if you already have model1 and model2 ready and you don't want to change them, you can create the third model like this:
from keras.models import Model
from keras.layers import Input, Dense, Concatenate

singleInput = Input((3,))
out1 = model1(singleInput)
out2 = model2(singleInput)
#....
#outN = modelN(singleInput)
out = Concatenate()([out1, out2])  # [out1, out2, ..., outN]
out = Dense(1, activation='sigmoid')(out)
model3 = Model(singleInput, out)
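With the shared input in place, model3 can be compiled and fit with a single copy of the data. A minimal sketch, assuming X is a NumPy array of shape (num_samples, 3) and y holds the binary labels:
model3.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model3.fit(X, y, epochs=50)   # one input array instead of [X, X]
Note that fitting model3 this way also updates the weights of model1 and model2, because they are reused as layers inside the new graph.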
And if you already have all the models ready, including model3, and don't want to change them, you can do something like this (not tested):
singleInput = Input((3,))
output = model3([singleInput, singleInput])
singleModel = Model(singleInput, output)
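Here the wrapper feeds the same tensor to both of model3's inputs, so prediction (and fitting, after compiling) only needs one copy of the data; again a sketch assuming X as above:
preds = singleModel.predict(X)   # single array; model3 receives it on both of its inputs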