OpenCV FeatureDetector

I am trying to write code that applies SURF object detection, so I took one of the OpenCV samples (sample 3) and started updating the onCameraViewStarted() and onCameraFrame() methods. However, I keep getting a runtime error when I run it on my Galaxy S3 phone, and I couldn't find anything to help with my problem. Here is my code and what I updated:

public class Sample3Native extends Activity implements CvCameraViewListener{

private static final String TAG = "OCVSample::Activity";

private Mat                    mRgba;
private Mat                    mGrayMat;
private CameraBridgeViewBase   mOpenCvCameraView;

Mat descriptors;
List<Mat> descriptorsList;

FeatureDetector featureDetector;
MatOfKeyPoint keyPoints;
DescriptorExtractor descriptorExtractor;
DescriptorMatcher descriptorMatcher;


private BaseLoaderCallback     mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        switch (status) {
            case LoaderCallbackInterface.SUCCESS:
            {
                Log.i(TAG, "OpenCV loaded successfully");

                // Load native library after(!) OpenCV initialization
                System.loadLibrary("native_sample");

                mOpenCvCameraView.enableView();
            } break;
            default:
            {
                super.onManagerConnected(status);
            } break;
        }
    }
};

public void onCameraViewStarted(int width, int height) {
    mRgba = new Mat(height, width, CvType.CV_8UC4);
    mGrayMat = new Mat(height, width, CvType.CV_8UC1);
    featureDetector = FeatureDetector.create(4);          // SURF = 4
    descriptorExtractor = DescriptorExtractor.create(2);  // SURF = 2
    descriptorMatcher = DescriptorMatcher.create(6);      // BRUTEFORCE_SL2 = 6

}

public Mat onCameraFrame(Mat inputFrame) {
    inputFrame.copyTo(mRgba);
    //detect_1(0, mRgba.getNativeObjAddr(), keyPoints.getNativeObjAddr());
    //Now mRgba contains the current frame (start manipulation part)
    //detecting keypoints
    featureDetector.detect(mRgba, keyPoints);
    //draw keypoints
    //Features2d.drawKeypoints(mRgba, keyPoints, mRgba);
    //finding descriptors
    descriptorExtractor.compute(mRgba, keyPoints, descriptors);
    //Matcher between 2 images or set of images
    //Note: training set and query set are handled here! (in matcher)
    //descriptorsList = descriptorMatcher.getTrainDescriptors();
    //descriptorsList.add(descriptors);
    //descriptorMatcher.add(descriptorsList);

    //Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
    //FindFeatures(mGrayMat.getNativeObjAddr(), mRgba.getNativeObjAddr());

    return mRgba;
}
}

Note: I have tried commenting out everything except the featureDetector.detect(mRgba, keyPoints) call in the onCameraFrame() method, and it still gave a runtime error on my phone.

asked Feb 18 '13 by user2083842

2 Answers

If I'm not mistaken, the OpenCV SURF feature detector only works with grayscale images. So try adding this after your call to copyTo() in the onCameraFrame() method, and then run the detector on mGrayMat instead of mRgba:

Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);
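
For reference, here is a minimal sketch of what onCameraFrame() could look like with that conversion in place. It assumes the OpenCV 2.4 Java API used in the question, and that keyPoints and descriptors have been allocated in onCameraViewStarted() (e.g. keyPoints = new MatOfKeyPoint(); descriptors = new Mat();), which the posted code never does:

public Mat onCameraFrame(Mat inputFrame) {
    inputFrame.copyTo(mRgba);

    // SURF works on single-channel data, so convert the RGBA frame first
    Imgproc.cvtColor(mRgba, mGrayMat, Imgproc.COLOR_RGBA2GRAY);

    // Detect and describe on the grayscale image, not on mRgba
    featureDetector.detect(mGrayMat, keyPoints);
    descriptorExtractor.compute(mGrayMat, keyPoints, descriptors);

    // Return the original colour frame for display
    return mRgba;
}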
answered by cid


Are you sure you are using SURF correctly? As far as I know, SIFT and SURF are not included in the OpenCV Android distribution package. To use them, you need to compile the nonfree module and use it in your project: create an NDK project, compile the nonfree module as a standalone library, and then link your program against that library. After that you should be able to build your application. You can refer to this tutorial.

Once you have the JNI library, you can easily wrap it in a Java JNI interface and then use that Java interface in your Android application.
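
To illustrate the Java side of such a wrapper (the library and method names below are hypothetical, not from any official sample), all it takes is a System.loadLibrary() call plus a native method declaration; the Mat objects are passed by their native addresses, just like in the question's commented-out detect_1() call:

public class SurfNative {

    static {
        // "nonfree_jni" is a placeholder for whatever you name your compiled wrapper library
        System.loadLibrary("nonfree_jni");
    }

    // Implemented in C++ against the nonfree module; the long arguments are the
    // native object addresses obtained with Mat.getNativeObjAddr()
    public static native void detectSurf(long grayMatAddr, long keyPointsAddr);
}

On the C++ side the matching JNI function casts those addresses back to cv::Mat pointers and runs the SURF detector from opencv2/nonfree/features2d.hpp. From Java you would then call something like SurfNative.detectSurf(mGrayMat.getNativeObjAddr(), keyPoints.getNativeObjAddr()) in onCameraFrame().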

answered by Robert Wang