I am trying to implement AlexNet with Keras and was inspecting the network design in MATLAB which is given as follows
As can be seen, the second convolution layer has 256 filters of size 5x5, 48 channels, and a padding of [2 2 2 2]. How can I specify a padding of [2 2 2 2] with Keras? I went through the documentation of `Conv2D`. It accepts only two values for padding, namely `valid` and `same`. I could not understand this. From what I know, `valid` would mean zero padding. How can I specify [2 2 2 2] padding for the second convolution layer? I created the first layer as:
model.add(keras.layers.Conv2D(filters = 96, kernel_size = (11,11),
strides = (4,4), padding = "valid", input_shape=(227,227,3)))
Also, since in the second layer there are 48 channels, do I need to be explicit about it?
A specific padding isn't specified in `Conv2D`; instead, you add a separate `ZeroPadding2D` layer.
`valid` and `same` are really just shorthands for common paddings: `valid` means you don't pad the input, and `same` means you add padding such that the output length is the same as the input length.
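As a quick sanity check of what the two shorthands mean, here is a minimal plain-Python sketch (the helper names are made up for illustration) computing the output length along one spatial dimension:

```python
def out_len_valid(in_len, kernel, stride=1):
    # "valid": no padding, so the kernel must fit entirely inside the input
    return (in_len - kernel) // stride + 1

def out_len_same(in_len, stride=1):
    # "same": enough padding is added so that the output length is
    # ceil(in_len / stride); for stride 1 it equals the input length
    return -(-in_len // stride)

# AlexNet's first layer: 227-wide input, 11-wide kernel, stride 4, "valid"
print(out_len_valid(227, 11, stride=4))  # -> 55
# A 27-wide input through a stride-1 "same" layer keeps its size
print(out_len_same(27))                  # -> 27
```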
In your case, if you want to add a specific padding of size 2 around the second convolution layer:
model.add(keras.layers.ZeroPadding2D(padding=(2, 2)))
model.add(keras.layers.Conv2D(filters = 256, kernel_size = (5,5), strides = (1,1), padding = "valid"))
I would also strongly suggest checking out this Keras implementation of AlexNet. Note that you can also find the docs for padding layers in the Keras convolutional docs (all the way at the bottom).
You got `valid` padding right; note that width and height will be smaller after a layer with this parameter. Padding `same`, on the other hand, means that a specific padding size will be used to ensure the image dimensions do not change.
For your specific case: if you pad the input image with 2 pixels on each side, you get exactly the same image size as output from the layer. So specifying `same` will perform exactly the same padding as [2 2 2 2].
If you want the formula for calculating the output size after a convolutional layer, check the first answer to this Quora question. I have rarely (if ever) seen different padding schemes, so those two usually suffice.
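The formula in question is the standard convolution output-size arithmetic; here is a short sketch of it (the function name is just for illustration):

```python
def conv_out_size(in_size, kernel, stride=1, pad=0):
    # Standard formula: out = floor((in + 2*pad - kernel) / stride) + 1
    return (in_size + 2 * pad - kernel) // stride + 1

# Second AlexNet conv layer: 27x27 input, 5x5 kernel, stride 1, padding 2
print(conv_out_size(27, 5, stride=1, pad=2))    # -> 27, size preserved
# First AlexNet conv layer: 227x227 input, 11x11 kernel, stride 4, no padding
print(conv_out_size(227, 11, stride=4, pad=0))  # -> 55
```

This also confirms the point above: with a 5x5 kernel and stride 1, padding 2 leaves the spatial size unchanged, which is exactly what `same` does here.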
BTW, all layers in AlexNet use `same` padding except the first one (as correctly pointed out in the comments to another answer).