I have a dataset of images that I want to feed into a convolutional neural network (CNN) model. However, each image also has an associated range, i.e. the distance to the object shown in the image.
I want to input this range as an additional piece of context for the CNN model.
Does providing this extra piece of information offer any benefit? Does it make sense to do? Is it feasible in Keras?
Thanks!
You have a few options here. One is to encode your numerical value as a feature plane in your input: if your numerical value is c, you can add a channel to each input image with the value c at every pixel.
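For that first option, here is a minimal NumPy sketch of building the extra channel, assuming placeholder arrays images with shape (n_samples, height, width, channels) and ranges with shape (n_samples,):

import numpy as np

# Broadcast each image's scalar range into a constant extra channel.
range_plane = np.ones_like(images[..., :1], dtype=float) * ranges[:, None, None, None]
images_with_range = np.concatenate([images, range_plane], axis=-1)
# New shape: (n_samples, height, width, channels + 1)

The CNN's input_shape then needs one extra channel to match.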
Another option is to merge the value in as an additional input to the fully connected part of the network.
In Keras, using the functional API, that would look something like:
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D, Flatten,
                                     Dense, Concatenate)
from tensorflow.keras.models import Model

# Convolutional branch for the image input
image_in = Input(shape=input_shape)
x = Conv2D(32, kernel_size=(3, 3), strides=(1, 1), activation='relu')(image_in)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Flatten()(x)
x = Dense(512, activation='relu')(x)

# Scalar branch for the range/distance value
range_in = Input(shape=(1,))
r = Dense(1, activation='relu')(range_in)

# Merge the two branches and classify
merged = Concatenate()([x, r])
out = Dense(n_classes, activation='softmax')(merged)

model = Model(inputs=[image_in, range_in], outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
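With a two-input model like this, training passes the image array and the range array together, in the order the inputs were declared. A sketch, assuming images, ranges, and one-hot labels arrays are already prepared:

# Feed both inputs in the order given to Model(inputs=[image_in, range_in])
model.fit([images, ranges], labels, batch_size=32, epochs=10)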
Which option you choose will depend on your data and whether you think the numerical feature will help the convolutional layers better understand the input, or whether you think it's not needed until later. If you have time you can try both architectures and see which performs better.