I want to make use of TensorFlow to implement a fully convolutional network. There is a function
tf.nn.conv2d_transpose(value, filter, output_shape, strides, padding, name),
which can be used for bilinear upsampling. However, I am confused about how to use it. The input is a single-channel image, and the output is also a single-channel image whose size is twice that of the input.
I tried to use the function as follows, but got an IndexError: list index out of range:
with tf.name_scope('deconv') as scope:
    deconv = tf.nn.conv2d_transpose(conv6, [3, 3, 1, 1],
                                    [1, 26, 20, 1], 2, padding='SAME', name=None)
Deconvolutional networks, also known as deconvolutional neural networks, are essentially convolutional neural networks (CNNs) run in reverse; although very similar in nature to CNNs, they are a distinct kind of model.
A transposed convolution is also called a deconvolution, which is not really appropriate, since deconvolution implies removing the effect of a convolution, and that is not what we are aiming to achieve. It is also called an upsampling convolution, a name that matches the task it performs: upsampling the input feature map.
While convolution without padding results in a smaller output, a transposed convolution increases the output size. With stride values greater than 1, it can therefore be used to upsample the data stream, and this appears to be its main use in deep learning.
A transposed convolutional layer attempts to reconstruct the spatial dimensions of its input, reversing the downsampling applied by a convolutional layer; a deconvolution, strictly speaking, is the mathematical operation that inverts the effect of a convolution.
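To make the size relationship concrete, here is a small illustrative helper (an assumption for this post, not part of the TensorFlow API) that computes the output spatial size of a transposed convolution for the two padding modes: with padding='SAME' it is input_size * stride, and with padding='VALID' it is (input_size - 1) * stride + kernel_size.

def transposed_conv_output_size(input_size, kernel_size, stride, padding='SAME'):
    # Output spatial size of tf.nn.conv2d_transpose along one dimension
    if padding == 'SAME':
        return input_size * stride
    return (input_size - 1) * stride + kernel_size  # 'VALID'

# A 13x10 single-channel input upsampled with a 3x3 kernel and stride 2:
print(transposed_conv_output_size(13, 3, 2, 'SAME'))   # 26
print(transposed_conv_output_size(10, 3, 2, 'SAME'))   # 20
print(transposed_conv_output_size(13, 3, 2, 'VALID'))  # 27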
Got it! (assuming input_size = [1, 13, 10, 1]):
with tf.name_scope('deconv') as scope:
    # filter must be a 4-D tensor [height, width, output_channels, in_channels], not a shape list
    kernel = tf.Variable(tf.truncated_normal([3, 3, 1, 1], stddev=0.1))
    deconv = tf.nn.conv2d_transpose(input_layer, kernel,
                                    output_shape=[1, 26, 20, 1],
                                    strides=[1, 2, 2, 1], padding='SAME')
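Since the goal is bilinear upsampling, the kernel can also be initialized with bilinear interpolation weights instead of random values. Below is a minimal sketch in the same TF 1.x style, assuming a 4x4 kernel for factor-2 upsampling (kernel_size = 2*factor - factor % 2, the usual FCN choice); the helper upsample_filt and the variable names are illustrative, not from the original post.

import numpy as np
import tensorflow as tf

def upsample_filt(size):
    # 2-D bilinear interpolation kernel of the given size (FCN-style initialization)
    factor = (size + 1) // 2
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    return (1 - abs(og[0] - center) / factor) * (1 - abs(og[1] - center) / factor)

# conv2d_transpose expects the filter as [height, width, output_channels, in_channels]
bilinear_weights = upsample_filt(4).reshape(4, 4, 1, 1).astype(np.float32)
bilinear_kernel = tf.constant(bilinear_weights)

input_layer = tf.placeholder(tf.float32, [1, 13, 10, 1])
upsampled = tf.nn.conv2d_transpose(input_layer, bilinear_kernel,
                                   output_shape=[1, 26, 20, 1],
                                   strides=[1, 2, 2, 1], padding='SAME')

With a constant kernel this performs fixed bilinear interpolation; wrapping the weights in a tf.Variable instead lets the upsampling filter be learned, which is common in fully convolutional networks.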