When to use in-place layers in Caffe?


By setting a layer's bottom and top blob to be the same, we can tell Caffe to do "in-place" computation and reduce memory consumption.
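For example, a minimal prototxt sketch of such a definition (the layer and blob names are just illustrative), where "ReLU" writes its output back into the blob produced by the convolution:

    layer {
      name: "conv1"
      type: "Convolution"
      bottom: "data"
      top: "conv1"
      convolution_param { num_output: 64 kernel_size: 3 }
    }
    layer {
      name: "relu1"
      type: "ReLU"
      bottom: "conv1"
      top: "conv1"   # top == bottom, so the ReLU is computed in place
    }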

Currently I know I can safely use "BatchNorm", "Scale" and "ReLU" layers in-place (please let me know if I'm wrong), while it seems to cause issues for some other layers (this issue seems to be an example).

When to use in-place layers in Caffe?
How does it work with back-propagation?

asked Jul 20 '16 by dontloo


1 Answer

As you well noted, in-place layers don't usually work "out of the box".
For some layers, it is quite trivial ("ReLU" and other neuron activation layers).
However, for others it requires special handling in code. For example, the implementation of the "PReLU" layer has a dedicated bottom_memory_ member variable that caches the input values needed for back-propagation, because the in-place forward pass would otherwise overwrite them.
You can see similar code in other layers that specifically test if (top[0] == bottom[0]) to detect whether the layer is being used in-place.
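For illustration, a simplified C++ sketch of that pattern (not the actual Caffe source; the layer name MyLayer and the exact bookkeeping are assumptions, loosely following the "PReLU" implementation):

    template <typename Dtype>
    void MyLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
                                     const vector<Blob<Dtype>*>& top) {
      const Dtype* bottom_data = bottom[0]->cpu_data();
      // In-place: top[0] and bottom[0] are the same blob, so the forward pass
      // below would overwrite the input; keep a copy for the backward pass.
      if (top[0] == bottom[0]) {
        caffe_copy(bottom[0]->count(), bottom_data,
                   bottom_memory_.mutable_cpu_data());
      }
      // ... compute top[0] from bottom_data ...
    }

    template <typename Dtype>
    void MyLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
                                      const vector<bool>& propagate_down,
                                      const vector<Blob<Dtype>*>& bottom) {
      // When run in-place, read the original input from the cached copy;
      // otherwise bottom[0] still holds the unmodified input.
      const Dtype* bottom_data = (top[0] == bottom[0])
          ? bottom_memory_.cpu_data()
          : bottom[0]->cpu_data();
      // ... use bottom_data and top[0]->cpu_diff() to fill
      //     bottom[0]->mutable_cpu_diff() ...
    }

A layer without such handling may read already-overwritten values during its backward pass, which is why only some layers are safe to use in-place.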

Moreover, it makes little sense to compute a layer in-place when its input and output have different shapes, so layers such as "Convolution", "InnerProduct" and "Pooling" are not candidates for in-place computation.

answered Oct 12 '22 by Shai