I'm trying to draw a shape using the OpenCV library in native code (i.e. in the jni folder of an Android app), then call the native method to return the image to the Java code and display it.
How can this be done? I can't find a simple OpenCV application for Android. I have set up the environment, and the OpenCV samples for Android, such as CVCamera and the calibration demos, run fine. They use SWIG for the JNI interface. Is SWIG mandatory? I didn't use SWIG. I have tried a lot and am still trying.
In Java, the code is like this:
import android.app.Activity;
import android.os.Bundle;

public class HelloJni extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(new ShapeView(this));
    }

    static {
        System.loadLibrary("hello-jni");
    }
}
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

class ShapeView extends View {
    private Bitmap mBitmap;

    private static native Object drawingShape();

    public ShapeView(Context context) {
        super(context);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        mBitmap = (Bitmap) drawingShape();
        canvas.drawBitmap(mBitmap, 0, 0, null);
    }
}
And this is the native JNI code:
#include "ShapeView.h"
#include <string.h>
#include <jni.h>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <iostream>

JNIEXPORT jobject JNICALL Java_ShapeView_drawingShape(JNIEnv *env, jclass obj)
{
    IplImage *img = cvCreateImage(cvSize(300, 300), IPL_DEPTH_8U, 3);
    cvRectangle(img, cvPoint(100, 100), cvPoint(200, 200), cvScalar(255, 0, 0), 1);
    cvCircle(img, cvPoint(100, 100), 20, cvScalar(0, 255, 0), 1);
    return (jobject) img;
}
I compiled the native code using ndk-build. The .so file was created and installed without errors, and the Java code compiles without errors too. However, when I run the application in the AVD, I see only a black screen, which disappears after a few seconds.
What is the mistake in my code?
And if SWIG is mandatory, how should it be used? I don't have much knowledge of OpenCV or SWIG.
You could use JavaCV instead. It gives full access to OpenCV from Java, with no need for JNI, SWIG, or the NDK:
http://code.google.com/p/javacv/
You may pass a byte array or an int array to the native code and use GetByteArrayElements() (or GetIntArrayElements()) to access the array through a pointer. Modify the data in place, then release the pointer at the end so that the changes are written back to the array on the Java side.
#include <jni.h>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc_c.h>

using namespace cv;

// Cached between frames; allocated on the first call.
static Mat *mCanny = NULL;

extern "C"
jboolean
Java_my_project_MyRealTimeImageProcessing_CameraPreview_ImageProcessing(
        JNIEnv *env, jobject thiz,
        jint width, jint height,
        jbyteArray NV21FrameData, jintArray outPixels)
{
    jbyte *pNV21FrameData = env->GetByteArrayElements(NV21FrameData, 0);
    jint *poutPixels = env->GetIntArrayElements(outPixels, 0);

    if (mCanny == NULL)
    {
        mCanny = new Mat(height, width, CV_8UC1);
    }

    // Wrap the Java arrays in Mat headers; no pixel data is copied.
    Mat mGray(height, width, CV_8UC1, (unsigned char *) pNV21FrameData);
    Mat mResult(height, width, CV_8UC4, (unsigned char *) poutPixels);

    IplImage srcImg = mGray;
    IplImage CannyImg = *mCanny;
    IplImage ResultImg = mResult;

    cvCanny(&srcImg, &CannyImg, 80, 100, 3);
    cvCvtColor(&CannyImg, &ResultImg, CV_GRAY2BGRA);

    // Write the results back to the Java arrays and free the local copies.
    env->ReleaseByteArrayElements(NV21FrameData, pNV21FrameData, 0);
    env->ReleaseIntArrayElements(outPixels, poutPixels, 0);
    return true;
}
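For illustration, the Java side of this approach might look like the sketch below. The class and method names are assumptions chosen to match the JNI symbol above; the key points are that the byte[] and int[] buffers are allocated once and reused every frame, and that after the native call the filled int[] can be handed to Bitmap.createBitmap(outPixels, width, height, Bitmap.Config.ARGB_8888) for display.

```java
// Hypothetical Java-side companion for the native function above. For the
// JNI symbol Java_my_project_MyRealTimeImageProcessing_CameraPreview_ImageProcessing
// to resolve, this class would have to live in the corresponding package.
public class CameraPreview {

    private static native boolean ImageProcessing(
            int width, int height, byte[] NV21FrameData, int[] outPixels);

    // An NV21 frame holds a full-resolution Y plane (width*height bytes)
    // followed by interleaved V/U samples at quarter resolution
    // (width*height/2 bytes). The native code above reads only the Y plane,
    // which is why passing the whole buffer as a CV_8UC1 image works.
    static int nv21FrameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // One packed 32-bit pixel per output position.
    static int outPixelCount(int width, int height) {
        return width * height;
    }
}
```

Allocating these buffers once in the preview callback setup (rather than per frame) avoids garbage-collection pauses during real-time processing.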
The sample code is from the article How to do real time image processing in Android using OpenCV?
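A note on why the CV_8UC4 output is directly displayable: the CV_GRAY2BGRA conversion copies the gray value into the B, G, and R channels and sets alpha to 255, storing the bytes in B, G, R, A order. On a little-endian device (as Android ARM devices are), a Java int read from those four bytes comes out as 0xAARRGGBB, which is exactly Android's ARGB_8888 packing. A minimal pure-Java sketch of the equivalent per-pixel packing (the method name grayToArgb is mine):

```java
public class GrayPacking {
    // Pack one 8-bit gray value into an opaque 0xAARRGGBB int, mirroring
    // what the BGRA bytes written by CV_GRAY2BGRA look like when read as a
    // little-endian 32-bit integer.
    static int grayToArgb(int gray) {
        return 0xFF000000 | (gray << 16) | (gray << 8) | gray;
    }
}
```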