I am a newbie in Caffe, and I am trying to normalize a convolution output to the range 0 to 1 with min-max normalization:
Out = (X - Xmin) / (Xmax - Xmin)
I have checked many layers (Power, Scale, Batch Normalization, MVN), but none of them gives me a min-max normalized output. Can anyone help me?
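For reference, this is roughly what I can already do manually with numpy (a minimal sketch on an arbitrary blob-shaped array), but I need the equivalent as a layer:

import numpy as np

# manual min-max normalization of an arbitrary blob-shaped array
x = np.random.rand(1, 1, 512, 512).astype(np.float32) * 255.0
x_min, x_max = x.min(), x.max()
out = (x - x_min) / (x_max - x_min)   # every value now lies in [0, 1]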
************* my prototxt *****************
name: "normalizationCheck"
layer {
name: "data"
type: "Input"
top: "data"
input_param { shape: { dim: 1 dim: 1 dim: 512 dim: 512 } }
}
layer {
name: "normalize1"
type: "Power"
bottom: "data"
top: "normalize1"
power_param {
shift: 0
scale: 0.00392156862
power: 1
}
}
layer {
bottom: "normalize1"
top: "Output"
name: "conv1"
type: "Convolution"
convolution_param {
num_output: 1
kernel_size: 1
pad: 0
stride: 1
bias_term: false
weight_filler {
type: "constant"
value: 1
}
}
}
The convolution layer output is not in normalized form. I want a min-max normalized output as a layer; I can do it manually in code, but I need it as a layer. Thanks.
You can write your own C++ layer following these guidelines; that page also shows how to implement "forward only" layers.
Alternatively, you can implement the layer in Python and execute it in Caffe via a "Python" layer:
First, implement your layer in Python and store it in /path/to/my_min_max_layer.py:
import caffe
import numpy as np

class min_max_forward_layer(caffe.Layer):
    def setup(self, bottom, top):
        # make sure there is exactly one input and one output
        assert len(bottom) == 1 and len(top) == 1, "min_max_layer expects a single input and a single output"

    def reshape(self, bottom, top):
        # reshape the output to be identical to the input
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        # min-max normalize the input blob
        in_ = np.array(bottom[0].data)
        x_min = in_.min()
        x_max = in_.max()
        top[0].data[...] = (in_ - x_min) / (x_max - x_min)

    def backward(self, top, propagate_down, bottom):
        # backward pass is not implemented
        pass
Once you have the layer implemented in Python, you can simply add it to your net (make sure /path/to is in your $PYTHONPATH):
layer {
  name: "my_min_max_forward_layer"
  type: "Python"
  bottom: "name_your_input_here"
  top: "name_your_output_here"
  python_param {
    module: "my_min_max_layer"      # name of the python file to be imported
    layer: "min_max_forward_layer"  # name of the layer class
  }
}
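To check the layer end to end, here is a minimal pycaffe sketch; the prototxt file name min_max_check.prototxt, the blob names, and the input shape are assumptions matching the snippets above, and Caffe must have been built with WITH_PYTHON_LAYER := 1 for "Python" layers to load:

import sys
sys.path.append('/path/to')   # alternative to setting $PYTHONPATH; assumed location of my_min_max_layer.py

import numpy as np
import caffe

caffe.set_mode_cpu()

# 'min_max_check.prototxt' is an assumed file name containing the layers above
net = caffe.Net('min_max_check.prototxt', caffe.TEST)

# feed an arbitrary input blob and run a forward pass
net.blobs['data'].data[...] = np.random.rand(1, 1, 512, 512) * 255.0
net.forward()

# the output of the min-max layer should now lie in [0, 1]
out = net.blobs['name_your_output_here'].data
print(out.min(), out.max())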