In the AlexNet implementation in Caffe, I saw the following layer in the deploy.prototxt file:
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
Now the key idea of dropout is to randomly drop units (along with their connections) from the neural network during training.
Does this mean that I can simply delete this layer from deploy.prototxt, as this file is meant to be used during testing only?
Yes. Dropout is not required during testing.
Even if you keep the dropout layer, nothing special happens at test time. See the forward pass in the dropout layer's source code:
if (this->phase_ == TRAIN) {
  // Training: randomly zero out units and scale the survivors
} else {
  // Testing: simply copy the bottom blob to the top blob
  caffe_copy(bottom[0]->count(), bottom_data, top_data);
}
As the source code shows, when the network is not in the training phase the bottom blob's data is simply copied into the top blob's memory, so the layer acts as an identity operation.