
Min-Max normalization Layer in Caffe

I am a newbie in Caffe. I am trying to normalize the convolution output to the range [0, 1] with min-max normalization:

Out = (X - Xmin) / (Xmax - Xmin)

I have checked many layers (Power, Scale, Batch Normalization, MVN), but none of them gives me a min-max normalized output. Can anyone help me?
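For reference, here is a quick NumPy sketch of the min-max formula above (the array values are just an illustrative example): the minimum of the input maps to 0, the maximum maps to 1, and everything else lands in between.

```python
import numpy as np

# Example input; any shape works since min/max are taken over the whole array.
x = np.array([[-3.0, 0.0],
              [ 1.5, 7.0]])

# Out = (X - Xmin) / (Xmax - Xmin)
out = (x - x.min()) / (x.max() - x.min())

# out.min() is 0.0 (at the position of -3.0) and out.max() is 1.0 (at 7.0);
# e.g. 1.5 maps to (1.5 - (-3.0)) / (7.0 - (-3.0)) = 0.45
```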

************* my prototxt *****************

name: "normalizationCheck"
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 1 dim: 1 dim: 512 dim: 512 } }
}

layer {
  name: "normalize1"
  type: "Power"
  bottom: "data"
  top: "normalize1"
  power_param { 
    shift: 0
    scale: 0.00392156862
    power: 1
   }
}

layer {
    bottom: "normalize1"
    top: "Output"
    name: "conv1"
    type: "Convolution"
    convolution_param {
        num_output: 1
        kernel_size: 1
        pad: 0
        stride: 1
        bias_term: false
        weight_filler {
        type: "constant"
        value: 1
        }
    }
}

The convolution layer output is not in normalized form. I want min-max normalized output produced by a layer; I can do it manually in code, but I need it as a layer. Thanks

Asked Dec 26 '16 by AnkitSahu

1 Answer

You can write your own C++ layer following these guidelines; that page shows how to implement "forward only" layers.

Alternatively, you can implement the layer in Python and execute it in Caffe via a "Python" layer:

First, implement your layer in Python and store it in '/path/to/my_min_max_layer.py':

import caffe
import numpy as np

class min_max_forward_layer(caffe.Layer):
  def setup(self, bottom, top):
    # make sure only one input and one output
    assert len(bottom)==1 and len(top)==1, "min_max_layer expects a single input and a single output"

  def reshape(self, bottom, top):
    # reshape output to be identical to input
    top[0].reshape(*bottom[0].data.shape)

  def forward(self, bottom, top):
    # min-max normalize the input blob
    in_ = np.array(bottom[0].data)
    x_min = in_.min()
    x_max = in_.max()
    # guard against division by zero when the input is constant
    top[0].data[...] = (in_ - x_min) / max(x_max - x_min, np.finfo(in_.dtype).eps)

  def backward(self, top, propagate_down, bottom):
    # backward pass is not implemented!
    pass
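
Since the backward pass is left unimplemented above, here is a hedged sketch of what a gradient could look like if you ever needed one. Under the common simplifying assumption that x_min and x_max are treated as constants (ignoring the extra gradient terms at the argmin/argmax elements), each output's derivative with respect to its input is just 1/(x_max - x_min). The function names here are illustrative, not part of Caffe's API:

```python
import numpy as np

def min_max_forward(x):
    # same math as the layer's forward pass
    x_min, x_max = x.min(), x.max()
    return (x - x_min) / (x_max - x_min)

def min_max_backward(x, top_diff):
    # approximate gradient, treating x_min and x_max as constants:
    # d(out_i)/d(x_i) = 1 / (x_max - x_min) for non-extremal elements
    return top_diff / (x.max() - x.min())
```

A finite-difference check on a non-extremal element agrees with this approximation, which is why it is often "good enough" when the normalization sits near the end of a forward-only pipeline.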

Once you have the layer implemented in Python, you can simply add it to your net prototxt (make sure '/path/to' is in your $PYTHONPATH):

layer {
  name: "my_min_max_forward_layer"
  type: "Python"
  bottom: "name_your_input_here"
  top: "name_your_output_here"
  python_param {
    module: "my_min_max_layer"  # name of python file to be imported
    layer: "min_max_forward_layer" # name of layer class
  }
}
Answered Oct 30 '22 by Shai