I'm trying to make a multiple-input model but am having trouble defining it.
I want to build something like this:
First Dense layer          First Dense layer
        |                          |
        |                          |
Second Dense layer         Second Dense layer
               \                  /
                \                /
          Final Dense layer (Single Output)
However I get the following error when running my model:
AttributeError: 'Concatenate' object has no attribute 'shape'
import numpy as np
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Flatten, Concatenate
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

def build_nn_model(x_input1_train, x_input2_train):

    """
    Creates a multi-channel ANN capable of accepting multiple inputs.
    :param x_input1_train: training data for the first input
    :param x_input2_train: training data for the second input
    :return: the compiled ANN model with a single output
    """
    x_input1_train = np.expand_dims(x_input1_train, 1)
    # define two sets of inputs for the model
    input1 = Input(shape = (x_input1_train.shape[1], 1))
    input2 = Input(shape = (x_input2_train.shape[1], 1))
    # The first branch operates on the first input
    x = Dense(units = 128, activation="relu")(input1)
    x = BatchNormalization()(x)
    
    x = Dense(units = 128, activation="relu")(x)
    x = Flatten()(x)
    x = BatchNormalization()(x)  
    
    x = Model(inputs=input1, outputs=x)
    # The second branch operates on the second input
    y = Dense(units = 128, activation="relu")(input2)
    y = BatchNormalization()(y)
    
    y = Dense(units = 128, activation="relu")(y)
    y = Flatten()(y)
    y = BatchNormalization()(y)  
    
    y = Model(inputs=input2, outputs=y)
    
    # combine the output of the two branches
    combined = Concatenate([x.output, y.output])
    
    # Apply a FC layer and then a regression activation on the combined outputs
    #z = Dense(2, activation="relu")(combined)
    #z = Dense(1, activation="linear")(z)
    
    outputs = Dense(128, activation='relu')(combined)
    #outputs = Dropout(0.5)(outputs)
    outputs = Dense(1)(outputs)
    # The model will accept the inputs of the two branches and then output a single value
    model = Model(inputs = [x.input, y.input], outputs = outputs)
    #model = Model(inputs=[x.input, y.input], outputs=z)
    # Compile the NN
    model.compile(loss='mse', optimizer = Adam(lr = 0.001), metrics = ['mse'])
    # ANN Summary
    model.summary()
    
    return model
Input1:
array([55., 46., 46., ..., 60., 60., 45.])
Shape: (2400,)
Input2:
array([[-2.00370455, -2.35689664, -1.96147382, ...,  2.11014128,
         2.59383321,  1.24209607],
       [-1.97130549, -2.19063663, -2.02996445, ...,  2.32125568,
         2.27316046,  1.48600614],
       [-2.01526666, -2.40440917, -1.94321752, ...,  2.15266657,
         2.68460488,  1.23534095],
       ...,
       [-2.1359458 , -2.52428007, -1.75701785, ...,  2.25480819,
         2.68114281,  1.75468981],
       [-1.95868206, -2.23297167, -1.96401751, ...,  2.07427239,
         2.60306072,  1.28556955],
       [-1.80507278, -2.62199521, -2.08697271, ...,  2.34080577,
         2.48254585,  1.52028871]])
Shape: (2400, 3840)
You need to add the parentheses to the Concatenate layer: it's Concatenate()([x.output, y.output]). Concatenate is a layer class, so you instantiate it first and then call it on the list of tensors; calling the class directly on the list just builds a layer object rather than a tensor, which is why the next layer complains that it has no shape attribute.
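As a minimal sketch against your own function (same variable names), the only change needed is on that line:

# wrong: this constructs a Concatenate layer object, which has no .shape
# combined = Concatenate([x.output, y.output])
# right: instantiate the layer first, then call it on the list of tensors
combined = Concatenate()([x.output, y.output])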
You can also write your model without the Flatten operation. Your data are 2D, so you don't need those extra manipulations: Flatten is only needed to go from 3D (or higher) down to 2D, but in your case you can start from 2D without problems.
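To illustrate the point, here is a small sketch (the names inp_2d, inp_3d, h and h3 are just for illustration): a Dense layer on a 2D input already produces a 2D output, while a 3D input would need a Flatten before the final Dense.

from tensorflow.keras.layers import Input, Dense, Flatten

inp_2d = Input(shape=(3840,))               # (batch, 3840): already 2D
h = Dense(128, activation="relu")(inp_2d)   # (batch, 128): still 2D, no Flatten needed

inp_3d = Input(shape=(3840, 1))             # (batch, 3840, 1): 3D
h3 = Dense(128, activation="relu")(inp_3d)  # (batch, 3840, 128): Dense acts on the last axis
h3 = Flatten()(h3)                          # (batch, 3840 * 128): now 2D again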
Here is a full example:
import numpy as np
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Dropout, Concatenate
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# dummy data with the same shapes as yours
n_sample = 2400
X1 = np.random.uniform(0,1, (n_sample,))  # (2400,)
X2 = np.random.uniform(0,1, (n_sample,3840))  # (2400,3840)
Y = np.random.uniform(0,1, (n_sample,))  # (2400,)

input1 = Input(shape = (1, ))
input2 = Input(shape = (3840, ))
# The first branch operates on the first input
x = Dense(units = 128, activation="relu")(input1)
x = BatchNormalization()(x)
x = Dense(units = 128, activation="relu")(x)
x = BatchNormalization()(x)
x = Model(inputs=input1, outputs=x)
# The second branch operates on the second input (Protein Embeddings)
y = Dense(units = 128, activation="relu")(input2)
y = BatchNormalization()(y)
y = Dense(units = 128, activation="relu")(y)
y = BatchNormalization()(y)  
y = Model(inputs=input2, outputs=y)
# combine the output of the two branches
combined = Concatenate()([x.output, y.output])
out = Dense(128, activation='relu')(combined)
out = Dropout(0.5)(out)
out = Dense(1)(out)
# The model will accept the inputs of the two branches and then output a single value
model = Model(inputs = [x.input, y.input], outputs = out)
model.compile(loss='mse', optimizer = Adam(learning_rate = 0.001), metrics = ['mse'])
model.fit([X1,X2], Y, epochs=3)
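After fitting, the model is used the same way, passing both inputs as a list; a quick sketch with the dummy arrays above:

preds = model.predict([X1, X2])  # one value per sample
print(preds.shape)               # (2400, 1)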
Here is the notebook.