I am wondering whether I can simply apply dropout to convolutions in TensorFlow. How is it applied? Are the weights of the convolution kernel randomly set to zero while it 'slides' over the input?
You can apply dropout to arbitrary input tensors. How the input was computed doesn't matter; each element of the input is simply either kept (and scaled, see below) or set to zero. Dropout does not zero the convolution kernel's weights — it acts on the convolution's output.
From https://www.tensorflow.org/api_docs/python/tf/nn/dropout:
> With probability `keep_prob`, outputs the input element scaled up by `1 / keep_prob`, otherwise outputs `0`. The scaling is so that the expected sum is unchanged. By default, each element is kept or dropped independently.
For example:
conv = tf.nn.conv2d(...)
drop = tf.nn.dropout(conv, keep_prob=0.5)  # in TF 2.x: tf.nn.dropout(conv, rate=0.5)
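To make the element-wise behavior concrete, here is a minimal NumPy sketch of the same "inverted dropout" rule the docs describe (the function name and seed are illustrative, not from TensorFlow): each element is independently kept with probability `keep_prob` and scaled by `1 / keep_prob`, or set to zero.

```python
import numpy as np

def inverted_dropout(x, keep_prob, rng):
    # Keep each element independently with probability keep_prob,
    # scaling survivors by 1/keep_prob so the expected sum is unchanged.
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

rng = np.random.default_rng(0)  # arbitrary seed for reproducibility
x = np.ones((4, 4))             # stand-in for a conv layer's output
y = inverted_dropout(x, keep_prob=0.5, rng=rng)
# Every element of y is either 0.0 (dropped) or 2.0 (kept and scaled
# by 1/0.5); on average, the sum of y equals the sum of x.
```

Note that the mask is drawn over the output tensor's shape, so whole activations are dropped — the kernel weights themselves are untouched.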