I am trying to perform L2 normalization of a layer's output in Caffe. The idea is to use the L2-normalized fc7 features in a contrastive loss, as in http://www.cs.cornell.edu/~kb/publications/SIG15ProductNet.pdf .
I found some links where people posted their code for an L2 normalization layer. However, I was wondering whether it can be done with Caffe's built-in Local Response Normalization layer, or possibly some other standard layer.
My final fc vector is 1x2048 (2048 channels of size 1x1). Can someone please guide me on this?
You can perform L2 normalization in Caffe by chaining a few standard layers (here `fc7` stands for your 1x2048 feature blob; substitute your own blob name):
# Step 1: sum of squares over the channel axis -> one scalar per example
layer {
  name: "denom"
  type: "Reduction"
  bottom: "fc7"
  top: "denom"
  reduction_param {
    operation: SUMSQ
    axis: 1
  }
}
# Step 2: inverse square root; the small shift avoids division by zero
layer {
  name: "power"
  type: "Power"
  bottom: "denom"
  top: "power"
  power_param {
    power: -0.5
    shift: 1e-12
  }
}
# Step 3: append a trailing singleton axis so the scale can be tiled
layer {
  name: "reshape"
  type: "Reshape"
  bottom: "power"
  top: "reshape"
  reshape_param {
    shape {
      dim: 1
    }
    axis: -1
    num_axes: 0
  }
}
# Step 4: replicate the per-example scale across all 2048 channels
layer {
  name: "tile"
  type: "Tile"
  bottom: "reshape"
  top: "tile"
  tile_param {
    axis: 1
    tiles: 2048
  }
}
# Step 5: multiply the original features by the tiled scale
layer {
  name: "elwise"
  type: "Eltwise"
  bottom: "fc7"
  bottom: "tile"
  top: "elwise"
  eltwise_param {
    operation: PROD
  }
}