I am trying to write code that applies SURF object detection, so I took one of the OpenCV samples (sample 3) and started updating the onCameraViewStarted() and onCameraFrame() methods, but I keep getting a runtime error when I run it on my Galaxy S3 phone, and I couldn't find anything to help with my problem. Here is my code with my updates:
public class Sample3Native extends Activity implements CvCameraViewListener {
    private static final String TAG = "OCVSample::Activity";

    private Mat mRgba;
    private Mat mGrayMat;
    private CameraBridgeViewBase mOpenCvCameraView;

    Mat descriptors;
    List<Mat> descriptorsList;
    FeatureDetector featureDetector;
    MatOfKeyPoint keyPoints;
    DescriptorExtractor descriptorExtractor;
    DescriptorMatcher descriptorMatcher;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    // Load native library after(!) OpenCV initialization
                    System.loadLibrary("native_sample");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    public void onCameraViewStarted(int width, int height) {
        mRgba = new Mat(height, width, CvType.CV_8UC4);
        mGrayMat = new Mat(height, width, CvType.CV_8UC1);
        featureDetector = FeatureDetector.create(4);         // SURF = 4
        descriptorExtractor = DescriptorExtractor.create(2); // SURF = 2
        descriptorMatcher = DescriptorMatcher.create(6);     // BRUTEFORCE_SL2 = 6
    }

    public Mat onCameraFrame(Mat inputFrame) {
        inputFrame.copyTo(mRgba);
        //detect_1(0, mRgba.getNativeObjAddr(), keyPoints.getNativeObjAddr());
        // Now mRgba contains the current frame (start manipulation part)
        // detecting keypoints
        featureDetector.detect(mRgba, keyPoints);
        // draw keypoints
        //Features2d.drawKeypoints(mRgba, keyPoints, mRgba);
        // finding descriptors
        descriptorExtractor.compute(mRgba, keyPoints, descriptors);
        // Matcher between 2 images or set of images
        // Note: training set and query set are handled here! (in matcher)
        //descriptorsList = descriptorMatcher.getTrainDescriptors();
        //descriptorsList.add(descriptors);
        //descriptorMatcher.add(descriptorsList);
        //Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
        //FindFeatures(mGrayMat.getNativeObjAddr(), mRgba.getNativeObjAddr());
        return mRgba;
    }
}
Note: I have tried commenting out everything except the featureDetector.detect(mRgba, keyPoints) call in the onCameraFrame() method, and it still gave a runtime error on my phone.
If I'm not mistaken, the OpenCV SURF feature detector only works with grayscale images, so try adding this after your call to copyTo() in the onCameraFrame() method:

Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);

and then run the detector on mGrayMat instead of mRgba.
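For reference, the RGBA-to-gray conversion above combines the color channels with OpenCV's documented luma weights (Y = 0.299 R + 0.587 G + 0.114 B) and drops the alpha channel. A minimal pure-Java sketch of the per-pixel formula, with no OpenCV dependency, just to show what the conversion produces:

```java
// Illustrates the per-pixel math behind Imgproc.cvtColor(..., COLOR_RGBA2GRAY):
// alpha is ignored and RGB is reduced with the standard luma weights.
public class GrayDemo {
    static int rgbaToGray(int r, int g, int b, int a) {
        // a (alpha) is intentionally unused: the conversion discards it
        return (int) Math.round(0.299 * r + 0.587 * g + 0.114 * b);
    }

    public static void main(String[] args) {
        System.out.println(rgbaToGray(255, 255, 255, 255)); // white -> 255
        System.out.println(rgbaToGray(0, 0, 0, 255));       // black -> 0
        System.out.println(rgbaToGray(255, 0, 0, 255));     // pure red -> 76
    }
}
```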
Are you sure you are using SURF correctly? As far as I know, SIFT and SURF are not included in the distribution package of OpenCV for Android. To use them, you need to compile the nonfree module yourself and use it in your project: create an NDK project, compile the nonfree module as a standalone library, then link against that library when building your program. After that you should be able to build your application. You can refer to this tutorial.

Once you have the JNI library, you can easily wrap it in a Java JNI interface, and then use that Java interface in your Android application.
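The build step described above might look roughly like the following Android.mk fragment. This is only a sketch: the module names, file names, and the OPENCV_ANDROID_SDK variable are illustrative assumptions, not taken from the tutorial.

```make
# Hypothetical Android.mk sketch: link a separately-compiled nonfree
# library into an NDK project together with a small JNI wrapper.
LOCAL_PATH := $(call my-dir)

# Prebuilt libnonfree.so, built beforehand from the OpenCV nonfree sources
include $(CLEAR_VARS)
LOCAL_MODULE    := nonfree_prebuilt
LOCAL_SRC_FILES := libnonfree.so
include $(PREBUILT_SHARED_LIBRARY)

# JNI wrapper exposing SURF to Java (surf_jni.cpp is an assumed file name)
include $(CLEAR_VARS)
include $(OPENCV_ANDROID_SDK)/sdk/native/jni/OpenCV.mk
LOCAL_MODULE           := surf_jni
LOCAL_SRC_FILES        := surf_jni.cpp
LOCAL_SHARED_LIBRARIES += nonfree_prebuilt
include $(BUILD_SHARED_LIBRARY)
```

The Java side would then load the wrapper with System.loadLibrary("surf_jni") (matching LOCAL_MODULE) before calling its native methods.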