I have a caffemodel file that contains layers that are not supported by ethereon's caffe-tensorflow conversion utility. I would like to generate a numpy representation of my caffemodel.
My question is, how do I convert a caffemodel file (I also have the prototxt, if that is useful) to a numpy file?
Additional info: I have Python and Caffe with its Python interface installed. I am clearly not experienced with caffe.
Numpy arrays are more compact than Python lists, which means they use less memory. Numpy is not only more efficient but also more convenient: it provides a large set of vector and matrix operations, as well as built-in routines such as FFTs, convolutions, statistics, and histograms.
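For instance, here is a minimal, purely illustrative sketch (unrelated to caffe itself) of the kind of operations numpy gives you out of the box:

import numpy as np

a = np.arange(6, dtype=np.float32).reshape(2, 3)  # a 2x3 matrix
b = np.ones((3, 2), dtype=np.float32)             # a 3x2 matrix

print(a @ b)                            # matrix multiplication
print(a.mean(axis=0))                   # column-wise statistics
print(np.fft.fft(a[0]))                 # FFT of the first row
print(np.convolve(a[0], [1.0, -1.0]))   # 1-D convolution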
Here's a nice function that converts a caffe net to a python list of dictionaries, so you can pickle it and read it any way you want:
import caffe

def shai_net_to_py_readable(prototxt_filename, caffemodel_filename):
    # read the net + weights
    net = caffe.Net(prototxt_filename, caffemodel_filename, caffe.TEST)
    pynet_ = []
    for li in range(len(net.layers)):  # for each layer in the net
        layer = {}  # store this layer's information
        layer['name'] = net._layer_names[li]
        # for each input to the layer (aka "bottom") store its name and shape
        layer['bottoms'] = [(net._blob_names[bi], net.blobs[net._blob_names[bi]].data.shape)
                            for bi in list(net._bottom_ids(li))]
        # for each output of the layer (aka "top") store its name and shape
        layer['tops'] = [(net._blob_names[bi], net.blobs[net._blob_names[bi]].data.shape)
                         for bi in list(net._top_ids(li))]
        layer['type'] = net.layers[li].type  # type of the layer
        # the internal parameters of the layer. not all layers have weights.
        layer['weights'] = [net.layers[li].blobs[bi].data[...]
                            for bi in range(len(net.layers[li].blobs))]
        pynet_.append(layer)
    return pynet_
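Once you have that list, producing an actual numpy/pickle file is just a matter of serialization. A minimal sketch (the file names deploy.prototxt, weights.caffemodel, and net.pkl are placeholders for your own files):

import pickle
import numpy as np

pynet = shai_net_to_py_readable('deploy.prototxt', 'weights.caffemodel')

# pickle the whole structure (names, shapes, types, weights)
with open('net.pkl', 'wb') as f:
    pickle.dump(pynet, f)

# or save each layer's weight arrays as individual .npy files
# (layer names containing '/' may need to be sanitized first)
for layer in pynet:
    for i, w in enumerate(layer['weights']):
        np.save('%s_param%d.npy' % (layer['name'], i), w)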