I am displaying a live camera preview in a SurfaceView using camera.startPreview(). Any idea how I can get live RGB readings from the camera?
Thanks
I thought I could get the data converted from the SurfaceView, but the best method to use is:
// open the camera and set the preview format and orientation
camera = Camera.open();
cameraParam = camera.getParameters();
cameraParam.setPreviewFormat(ImageFormat.NV21);
camera.setDisplayOrientation(90);
camera.setParameters(cameraParam);

// attach the preview surface and turn the torch on
cameraParam = camera.getParameters();
try {
    camera.setPreviewDisplay(surfaceHolder); // setPreviewDisplay throws IOException
} catch (IOException e) {
    e.printStackTrace();
}
cameraParam.setFlashMode(Parameters.FLASH_MODE_TORCH);
camera.setParameters(cameraParam);
camera.startPreview();
Then I call setPreviewCallback and onPreviewFrame to get the incoming frame and convert it to an RGB pixel array. I can then get the intensity of each color in the picture by running the myPixels array through a for loop and averaging the intensity of every pixel, checking Color.red(myPixels[i]) for each desired color (inside onPreviewFrame); a sketch of that averaging step is shown after the callback below.
camera.setPreviewCallback(new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        int frameHeight = camera.getParameters().getPreviewSize().height;
        int frameWidth = camera.getParameters().getPreviewSize().width;
        // number of pixels in the frame
        int rgb[] = new int[frameWidth * frameHeight];
        // conversion: transforms the NV21 pixel data into RGB pixels
        int[] myPixels = decodeYUV420SP(rgb, data, frameWidth, frameHeight);
    }
});
Where decodeYUV420SP is found here.
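The averaging step mentioned above is not shown in the snippet; a minimal sketch of it (my own illustration, assuming myPixels holds the converted ARGB pixels from the callback above) could look like this:

// Sketch: average the red, green and blue intensity over the whole frame,
// using android.graphics.Color to pull each channel out of an ARGB pixel.
long sumR = 0, sumG = 0, sumB = 0;
for (int i = 0; i < myPixels.length; i++) {
    sumR += Color.red(myPixels[i]);
    sumG += Color.green(myPixels[i]);
    sumB += Color.blue(myPixels[i]);
}
float avgR = (float) sumR / myPixels.length;
float avgG = (float) sumG / myPixels.length;
float avgB = (float) sumB / myPixels.length;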
I timed this operation to take about 200ms for each frame. Is there a faster way of doing it?
You can do something similar to the code below:
camera.takePicture(shutterCallback, rawCallback, jpegCallback);

jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        try {
            // decode the JPEG bytes into a Bitmap
            Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
            int picw = bitmap.getWidth();
            int pich = bitmap.getHeight();
            int[] pix = new int[picw * pich];
            bitmap.getPixels(pix, 0, picw, 0, 0, picw, pich);
            for (int y = 0; y < pich; y++) {
                for (int x = 0; x < picw; x++) {
                    int index = y * picw + x;
                    int R = (pix[index] >> 16) & 0xff; // bitwise shifting
                    int G = (pix[index] >> 8) & 0xff;
                    int B = pix[index] & 0xff;
                    // R, G and B now hold the channel intensities of this pixel
                    pix[index] = 0xff000000 | (R << 16) | (G << 8) | B;
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
Here camera.takePicture(shutterCallback, rawCallback, jpegCallback); is called at image-capture time, so I think you need to call this method repeatedly while your camera is open.
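A minimal sketch of that repeated capture loop, assuming a Handler on the main thread and a captureIntervalMs value of your choosing (both names are my own, not from the answer above); note that takePicture() stops the preview, so it has to be restarted before the next capture:

// Sketch: re-trigger takePicture() at a fixed interval while the camera is open.
final Handler handler = new Handler();
final long captureIntervalMs = 1000; // assumed interval; tune as needed

final Runnable captureLoop = new Runnable() {
    @Override
    public void run() {
        camera.takePicture(shutterCallback, rawCallback, jpegCallback);
    }
};

jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        // ... process the JPEG bytes as shown above ...
        camera.startPreview(); // takePicture() stops the preview, so restart it
        handler.postDelayed(captureLoop, captureIntervalMs); // schedule the next capture
    }
};

handler.postDelayed(captureLoop, captureIntervalMs); // start the loop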