I'm using Caffe to classify non-image data using a fairly simple CNN structure. I had no problems training my network on my HDF5 data with dimensions n x 1 x 156 x 12. However, I'm having difficulty classifying new data.
How do I do a simple forward pass without any preprocessing? My data has been normalized and has the correct dimensions for Caffe (it has already been used to train the net). Below are my code and the CNN structure.
EDIT: I've isolated the problem to the function _Net_forward in pycaffe.py and found that the issue arises because the self.inputs attribute is empty. Can anyone explain why that is? The set is supposed to equal the set coming from the new test data:
if set(kwargs.keys()) != set(self.inputs):
    raise Exception('Input blob arguments do not match net inputs.')
My code has changed a bit, as I now use the IO methods to convert the data into a datum (see below). That way I've filled the kwargs variable with the correct data.
Even small hints would be greatly appreciated!
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
# Make sure that caffe is on the python path:
caffe_root = '' # this file is expected to be run from {caffe_root}
import sys
sys.path.insert(0, caffe_root + 'python')
import caffe
import os
import subprocess
import h5py
import shutil
import tempfile
import sklearn
import sklearn.datasets
import sklearn.linear_model
import skimage.io
def LoadFromHDF5(dataset='test_reduced.h5', path='Bjarke/hdf5_classification/data/'):
    f = h5py.File(path + dataset, 'r')
    dat = f['data'][:]
    f.close()
    return dat

def runModelPython():
    model_file = 'Bjarke/hdf5_classification/conv_v2_simple.prototxt'
    pretrained = 'Bjarke/hdf5_classification/data/train_iter_10000.caffemodel'
    test_data = LoadFromHDF5()
    net = caffe.Net(model_file, pretrained)
    caffe.set_mode_cpu()
    caffe.set_phase_test()
    user = test_data[0, :, :, :]
    datum = caffe.io.array_to_datum(user.astype(np.uint8))
    user_dat = caffe.io.datum_to_array(datum)
    user_dat = user_dat.astype(np.uint8)
    out = net.forward_all(data=np.asarray([user_dat]))

if __name__ == '__main__':
    runModelPython()
CNN Prototxt
name: "CDR-CNN"
layers {
  name: "data"
  type: HDF5_DATA
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "Bjarke/hdf5_classification/data/train.txt"
    batch_size: 10
  }
  include: { phase: TRAIN }
}
layers {
  name: "data"
  type: HDF5_DATA
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "Bjarke/hdf5_classification/data/test.txt"
    batch_size: 10
  }
  include: { phase: TEST }
}
layers {
  name: "feature_conv"
  type: CONVOLUTION
  bottom: "data"
  top: "feature_conv"
  blobs_lr: 1
  blobs_lr: 2
  convolution_param {
    num_output: 10
    kernel_w: 12
    kernel_h: 1
    stride_w: 1
    stride_h: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layers {
  name: "conv1"
  type: CONVOLUTION
  bottom: "feature_conv"
  top: "conv1"
  blobs_lr: 1
  blobs_lr: 2
  convolution_param {
    num_output: 14
    kernel_w: 1
    kernel_h: 4
    stride_w: 1
    stride_h: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layers {
  name: "pool1"
  type: POOLING
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_w: 1
    kernel_h: 3
    stride_w: 1
    stride_h: 3
  }
}
layers {
  name: "conv2"
  type: CONVOLUTION
  bottom: "pool1"
  top: "conv2"
  blobs_lr: 1
  blobs_lr: 2
  convolution_param {
    num_output: 120
    kernel_w: 1
    kernel_h: 5
    stride_w: 1
    stride_h: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layers {
  name: "fc1"
  type: INNER_PRODUCT
  bottom: "conv2"
  top: "fc1"
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 84
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  name: "accuracy"
  type: ACCURACY
  bottom: "fc1"
  bottom: "label"
  top: "accuracy"
  include: { phase: TEST }
}
layers {
  name: "loss"
  type: SOFTMAX_LOSS
  bottom: "fc1"
  bottom: "label"
  top: "loss"
}
Here is the answer I got from Evan Shelhamer on the Caffe Google Groups:
self._inputs is indeed for the manual or "deploy" inputs as defined by the input fields in a prototxt. To run a net with data layers in through pycaffe, just call net.forward() without arguments. No need to change the definition of your train or test nets. See for instance code cell [10] of the Python LeNet example.
In fact I think it's clearer in the Instant Recognition with Caffe tutorial, cell 6:
# Feed in the image (with some preprocessing) and classify with a forward pass.
net.blobs['data'].data[...] = transformer.preprocess('data', caffe.io.load_image(caffe_root + 'examples/images/cat.jpg'))
out = net.forward()
print("Predicted class is #{}.".format(out['prob'].argmax()))
In other words, to generate the predicted outputs as well as their probabilities using pycaffe, once you have trained your model, you first have to feed the data layer with your input, then perform a forward pass with net.forward().
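Putting that together for the non-image data in the question, below is a minimal sketch of preparing one sample for the data blob. The shapes follow the n x 1 x 156 x 12 layout above; the blob name 'data' and the commented pycaffe calls are assumptions based on the prototxt in the question, not tested against this exact net:

```python
import numpy as np

# One normalized sample of shape (156, 12), standing in for a row of the
# HDF5 data; real code would slice it from LoadFromHDF5().
sample = np.random.rand(156, 12).astype(np.float32)

# Caffe blobs are 4-D: (num, channels, height, width) -> (1, 1, 156, 12).
batch = sample[np.newaxis, np.newaxis, :, :]
print(batch.shape)  # (1, 1, 156, 12)

# With a deploy-style net loaded in pycaffe, the forward pass would then be:
#   net.blobs['data'].data[...] = batch
#   out = net.forward()
#   prediction = out['fc1'].argmax()   # 'fc1' is this net's final learned layer
```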
Alternatively, as pointed out in other answers, you can use a deploy prototxt that is similar to the one defining the trained network, but with the data and loss layers removed, and with the following added at the beginning (adapted to your input dimensions):
name: "your_net"
input: "data"
input_dim: 1
input_dim: 1
input_dim: 1
input_dim: 250
That's what they use in the CIFAR10 tutorial.
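Adapted to the dimensions in this question (a single 1 x 156 x 12 sample), a sketch of such a deploy header might look as follows; the adaptation is an assumption based on the prototxt above, and you would also typically replace the SOFTMAX_LOSS layer with a plain SOFTMAX layer so the net outputs probabilities rather than a loss:

```protobuf
name: "CDR-CNN"
input: "data"
input_dim: 1    # batch size
input_dim: 1    # channels
input_dim: 156  # height
input_dim: 12   # width
```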
(pycaffe really ought to be better documented…)
From my own experimental experience, it's not a good idea to specify the train and test nets in one file using the {PHASE} clause. I got many weird errors when I used a net file like that, but when I used the older style with two separate files, one for train and one for test, it worked. However, I was using a Caffe version from November 2014, so perhaps there was a bug or compatibility issue there.
Well, when the model is used for prediction, shouldn't there be a deploy file specifying the net structure? If you look at ImageNet you should find imagenet_deploy.prototxt there. Although the deploy file is similar to the train/test file, I've heard it's a bit different due to some fillers. I don't know if that's the problem, but any discussion is welcome; I need to learn the new Caffe schema if one exists.