Using the raw camera byte[] array for augmented reality

I'm developing an Augmented Reality app, so I need to capture the camera preview, add visual effects to it, and display it on screen. I would like to do this using the onPreviewFrame method of PreviewCallback. This gives me a byte[] variable containing raw image data (YUV420 encoded) to work with.

I have searched for a solution for many hours, but I cannot find a way to convert this byte[] data into any image format I can work with or even draw on the screen.

Preferably, I would like to convert the byte[] data to some RGB format that can be used both for computations and for drawing.

Is there a proper way to do this?


1 Answer

I stumbled upon the same issue a few months back when I had to do some edge detection on the camera frames. This works perfectly for me. Try it out.

import java.io.ByteArrayOutputStream;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.view.SurfaceHolder;

public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    camera.setPreviewCallback(new PreviewCallback() {

        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Parameters parameters = camera.getParameters();
            int width = parameters.getPreviewSize().width;
            int height = parameters.getPreviewSize().height;

            // The preview data arrives as NV21 (a YUV420 format) by default;
            // YuvImage can compress it straight to JPEG.
            ByteArrayOutputStream outstr = new ByteArrayOutputStream();
            Rect rect = new Rect(0, 0, width, height);
            YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
            yuvimage.compressToJpeg(rect, 100, outstr);

            // Decode the JPEG bytes into an RGB Bitmap you can draw on or process.
            Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
        }
    });
}

You can use the bitmap for all your processing purposes now. Grab whichever pixels you are interested in and you can comfortably do your RGB or HSV work on them; a sketch follows below.
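To make that last point concrete, here is a minimal sketch (not part of the answer above; the helper name analyzeFrame is made up for illustration) of pulling RGB and HSV values out of the decoded Bitmap with android.graphics.Color:

import android.graphics.Bitmap;
import android.graphics.Color;

// Hypothetical helper: inspect one pixel of the decoded preview Bitmap.
void analyzeFrame(Bitmap bmp) {
    // getPixel returns a packed ARGB int for the given coordinates.
    int pixel = bmp.getPixel(bmp.getWidth() / 2, bmp.getHeight() / 2);

    int r = Color.red(pixel);
    int g = Color.green(pixel);
    int b = Color.blue(pixel);

    // Convert the same pixel to HSV (hue 0..360, saturation/value 0..1).
    float[] hsv = new float[3];
    Color.RGBToHSV(r, g, b, hsv);
}

One caveat for an augmented reality loop: compressing every preview frame to JPEG and decoding it back is relatively costly. If you need the RGB values of the whole frame on every callback, a common alternative is to convert the NV21 buffer to ARGB directly. The sketch below is adapted from the widely circulated decodeYUV420SP routine; its use here is my suggestion, not part of the original answer:

// Convert an NV21 (YUV420 semi-planar) preview buffer to packed ARGB pixels.
// Adapted from the commonly shared decodeYUV420SP routine.
static void decodeYUV420SP(int[] argb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        // The chroma plane (interleaved V/U) starts after the luma plane.
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {          // one V/U pair covers two pixels
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            // Clamp to the valid fixed-point range before packing.
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            argb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}

If you also want to draw the result, you can wrap the int[] in a Bitmap via Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888).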
