I am working on a fuzzy convolution filter for CNNs. I have the function ready: it takes a 2D input matrix and a 2D kernel/weight matrix, and outputs the convolved feature (activation map).
Now I want to use Keras to build the rest of the CNN, which will also have standard 2D convolution filters.
Is there any way to insert my custom filter into the Keras model such that the kernel matrix is updated by the built-in Keras backend? Alternatively, is there any library I can use to update the kernel on every iteration?
Suppose we want to apply a 3x3 custom filter to a 6x6 image.
Necessary imports
import keras.backend as K
import numpy as np
from keras import Input, layers
from keras.models import Model
Definition of the custom filter
# custom filter: Keras passes shape=(kernel_h, kernel_w, in_channels, out_channels)
def my_filter(shape, dtype=None):
    f = np.array([
        [[[1]], [[0]], [[-1]]],
        [[[1]], [[0]], [[-1]]],
        [[[1]], [[0]], [[-1]]]
    ])
    assert f.shape == shape  # here: (3, 3, 1, 1)
    return K.variable(f, dtype='float32')
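Note that Keras calls the initializer with shape=(kernel_height, kernel_width, input_channels, output_channels), which is why the 3x3 kernel above is nested to shape (3, 3, 1, 1). A quick standalone check with plain NumPy (no Keras needed) makes the layout visible:

```python
import numpy as np

# Same nested structure as in my_filter above: each scalar is wrapped so the
# array has shape (kernel_h, kernel_w, in_channels, out_channels) = (3, 3, 1, 1).
f = np.array([
    [[[1]], [[0]], [[-1]]],
    [[[1]], [[0]], [[-1]]],
    [[[1]], [[0]], [[-1]]]
])

print(f.shape)        # (3, 3, 1, 1)
print(f[:, :, 0, 0])  # the underlying 2D kernel (a vertical-edge detector)
```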
Dummy example input image (a single-channel image, so its dimensions are 6x6x1; the pixel values here are random integers, although in practice pixel values typically range from 0 to 255, or 0.0 to 1.0 after normalization).
input_mat = np.array([
    [[4], [9], [2], [5], [8], [3]],
    [[3], [6], [2], [4], [0], [3]],
    [[2], [4], [5], [4], [5], [2]],
    [[5], [6], [5], [4], [7], [8]],
    [[5], [7], [7], [9], [2], [1]],
    [[5], [8], [5], [3], [8], [4]]
])
# Keras expects a batch dimension, so add one at the front:
# (batch, height, width, channels) = (1, 6, 6, 1), i.e. batch size = 1
input_mat = input_mat.reshape((1, 6, 6, 1))
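Before running the model, we can compute the expected result by hand. Keras's Conv2D actually performs cross-correlation (the kernel is not flipped), so with stride 2 and 'valid' padding each output value is the element-wise product-sum of a 3x3 patch with the kernel. A plain-NumPy sketch of that computation:

```python
import numpy as np

img = np.array([
    [4, 9, 2, 5, 8, 3],
    [3, 6, 2, 4, 0, 3],
    [2, 4, 5, 4, 5, 2],
    [5, 6, 5, 4, 7, 8],
    [5, 7, 7, 9, 2, 1],
    [5, 8, 5, 3, 8, 4],
])
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]])
stride = 2

# 'valid' cross-correlation with stride 2: slide the kernel over the image
# without padding, multiplying each 3x3 patch element-wise and summing
out = np.array([
    [(img[i:i + 3, j:j + 3] * kernel).sum()
     for j in range(0, img.shape[1] - 2, stride)]
    for i in range(0, img.shape[0] - 2, stride)
])
print(out)  # [[ 0 -4]
            #  [-5  3]]
```

The 2x2 result matches the model output shown below.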
Dummy conv model that uses our custom filter
def build_model():
    input_tensor = Input(shape=(6, 6, 1))
    x = layers.Conv2D(filters=1,
                      kernel_size=3,
                      kernel_initializer=my_filter,
                      strides=2,
                      padding='valid')(input_tensor)
    model = Model(inputs=input_tensor, outputs=x)
    return model
Testing
model = build_model()
out = model.predict(input_mat)
print(out)
Output
[[[[ 0.]
[-4.]]
[[-5.]
[ 3.]]]]
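As for the update question: kernel_initializer only sets the initial weights. A kernel created this way is an ordinary trainable variable, so whatever optimizer you compile the model with will update it by backpropagation, alongside the standard Conv2D layers. A minimal sketch of checking this (the loss, target, and optimizer here are illustrative assumptions, not part of the code above; the initializer returns a plain NumPy array, which Keras also accepts):

```python
import numpy as np
from keras import Input, layers
from keras.models import Model

np.random.seed(0)

def my_filter(shape, dtype=None):
    # same vertical-edge kernel as above, shaped (3, 3, 1, 1)
    f = np.array([
        [[[1]], [[0]], [[-1]]],
        [[[1]], [[0]], [[-1]]],
        [[[1]], [[0]], [[-1]]]
    ])
    assert f.shape == shape
    return f.astype('float32')

input_tensor = Input(shape=(6, 6, 1))
x = layers.Conv2D(1, 3, kernel_initializer=my_filter,
                  strides=2, padding='valid')(input_tensor)
model = Model(inputs=input_tensor, outputs=x)

x_train = np.random.rand(1, 6, 6, 1)  # dummy input
y_train = np.zeros((1, 2, 2, 1))      # dummy target

w_before = model.get_weights()[0].copy()
model.compile(optimizer='sgd', loss='mse')
model.fit(x_train, y_train, epochs=1, verbose=0)
w_after = model.get_weights()[0]

print(np.allclose(w_before, w_after))  # False: the kernel was updated
```

If instead you want the custom kernel to stay fixed while the rest of the network trains, set trainable=False on that Conv2D layer.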