I'm building a real-time object detection app with OpenCV on Android. I'm using the Android Camera2 API with a TextureView to capture images. I want to add OpenCV code to do some real-time image processing and preview the result.
Here is my code for taking a picture:
protected void takePicture() {
    if (null == cameraDevice) {
        Log.e(TAG, "cameraDevice is null");
        return;
    }
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
        Size[] jpegSizes = null;
        if (characteristics != null) {
            jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
        }
        int width = 640;
        int height = 480;
        if (jpegSizes != null && 0 < jpegSizes.length) {
            width = jpegSizes[0].getWidth();
            height = jpegSizes[0].getHeight();
        }
        ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
        List<Surface> outputSurfaces = new ArrayList<Surface>(2);
        outputSurfaces.add(reader.getSurface());
        outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
        final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(reader.getSurface());
        captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        // Orientation
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
        String timeStamp = new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date());
        final File file = new File(Environment.getExternalStorageDirectory() + "/Billboard_" + timeStamp + ".jpg");
        // get the location from the NetworkProvider
        LocationManager lm = (LocationManager) this.getSystemService(Context.LOCATION_SERVICE);
        LocationListener locationListener = new LocationListener() {
            @Override
            public void onLocationChanged(Location location) {
                longitude = location.getLongitude();
                latitude = location.getLatitude();
                storeGeoCoordsToImage(file, location);
                Log.e(TAG, "Latitude = " + latitude);
                Log.e(TAG, "Longitude = " + longitude);
            }
            @Override
            public void onProviderDisabled(String provider) {}
            @Override
            public void onProviderEnabled(String provider) {}
            @Override
            public void onStatusChanged(String provider, int status, Bundle extras) {}
        };
        // update location listener
        lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 0, 0, locationListener);
        ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = null;
                try {
                    image = reader.acquireLatestImage();
                    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                    byte[] bytes = new byte[buffer.capacity()];
                    buffer.get(bytes);
                    save(bytes);
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    if (image != null) {
                        image.close();
                    }
                }
            }
            private void save(byte[] bytes) throws IOException {
                OutputStream output = null;
                try {
                    output = new FileOutputStream(file);
                    output.write(bytes);
                } finally {
                    if (null != output) {
                        output.close();
                    }
                }
            }
        };
        reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
        final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
                super.onCaptureCompleted(session, request, result);
                Toast.makeText(MainActivity.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
                createCameraPreview();
            }
        };
        cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }
            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
            }
        }, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
I want to add some Java OpenCV code like this and preview the result on the screen:
Mat destination = new Mat();
Imgproc.cvtColor(source, destination, Imgproc.COLOR_RGB2GRAY);
Imgproc.equalizeHist(destination, destination);
Imgproc.Canny(destination, destination, 50, 150);
I'm confused about how to get the image from the camera preview, do some image processing on it, and then display the result.
Any help with integrating OpenCV and the Camera2 API would be appreciated. Thank you.
So, if you want to process a photo from the camera, you should:
1. Convert the captured Image to an OpenCV Mat.
2. Process the Mat with OpenCV.
3. Convert the result back to a Bitmap and publish it to the screen.
For example, you may do it like this:
// you need to create some interface in your activity to update the image on the screen
// and initialize it before you use it
OnImageReadyListener onImageReadyListener = null;
//...
private final ImageReader.OnImageAvailableListener onImageAvailableListener = (ImageReader imReader) -> {
    final Image image = imReader.acquireLatestImage();
    // 1st step: convert the image to Mat
    Mat source = ImageConverter.ImageToMat(image);
    // 2nd step: process it with OpenCV
    Mat destination = new Mat();
    // ImageConverter produces an RGBA Mat, so convert with COLOR_RGBA2GRAY
    Imgproc.cvtColor(source, destination, Imgproc.COLOR_RGBA2GRAY);
    Imgproc.equalizeHist(destination, destination);
    Imgproc.Canny(destination, destination, 50, 150);
    // 3rd step: publish your result
    if (onImageReadyListener != null)
        onImageReadyListener.getImage(ImageConverter.MatToBitmap(destination));
    image.close();
};
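Register this listener on your ImageReader just like in your takePicture() code, e.g. reader.setOnImageAvailableListener(onImageAvailableListener, mBackgroundHandler);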
And your OnImageReadyListener interface may look like this:
// set up this interface in the activity where you will update your image
public interface OnImageReadyListener {
    void getImage(Bitmap image); // then you should override this method
}
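For completeness, here is a minimal sketch of the activity side, assuming your layout contains an ImageView; the id processed_view and the field names are placeholders for illustration, not part of the original code:
public class MainActivity extends AppCompatActivity implements OnImageReadyListener {
    private ImageView processedView; // hypothetical ImageView that shows the processed frame

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        processedView = (ImageView) findViewById(R.id.processed_view);
        onImageReadyListener = this; // hand the activity to wherever your pipeline keeps its listener field
    }

    @Override
    public void getImage(final Bitmap image) {
        // the ImageReader callback runs on a background handler,
        // so hop to the UI thread before touching the view
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                processedView.setImageBitmap(image);
            }
        });
    }
}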
And the ImageConverter class could look like this:
public class ImageConverter {

    private static final String TAG = ImageConverter.class.getSimpleName();

    // Convert bitmap from JPEG to ARGB8888 format
    private static Bitmap JPEGtoARGB8888(Bitmap input) {
        Bitmap output = null;
        int size = input.getWidth() * input.getHeight();
        int[] pixels = new int[size];
        input.getPixels(pixels, 0, input.getWidth(), 0, 0, input.getWidth(), input.getHeight());
        output = Bitmap.createBitmap(input.getWidth(), input.getHeight(), Bitmap.Config.ARGB_8888);
        output.setPixels(pixels, 0, output.getWidth(), 0, 0, output.getWidth(), output.getHeight());
        return output; // ARGB_8888 formatted bitmap
    }

    // Get image Mat from Bitmap
    private static Mat BitmapToMat(Bitmap bitmap) {
        Bitmap bitmapARGB8888 = JPEGtoARGB8888(bitmap);
        Mat imageMat = new Mat();
        Utils.bitmapToMat(bitmapARGB8888, imageMat); // produces an RGBA (CV_8UC4) Mat
        return imageMat;
    }

    // Convert camera Image data to OpenCV image Mat(rix)
    public static Mat ImageToMat(Image image) {
        // check image
        if (image == null)
            return null;
        // store the JPEG data in a byte array
        final ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        final byte[] bytes = new byte[buffer.capacity()];
        buffer.get(bytes);
        // decode the bytes to a Bitmap and convert it to a Mat
        Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
        return BitmapToMat(bitmap);
    }

    // Inverse conversion after image processing to show it on the device screen;
    // assumes the input Mat is single-channel (grayscale), as produced by the pipeline above
    public static Bitmap MatToBitmap(Mat image) {
        Bitmap bitmap = null;
        Mat convertedMat = new Mat(image.height(), image.width(), CvType.CV_8U, new Scalar(4));
        try {
            Imgproc.cvtColor(image, convertedMat, Imgproc.COLOR_GRAY2RGBA, 4);
            bitmap = Bitmap.createBitmap(convertedMat.cols(), convertedMat.rows(), Bitmap.Config.ARGB_8888);
            Utils.matToBitmap(convertedMat, bitmap);
        } catch (CvException e) {
            Log.d(TAG, e.getMessage());
        }
        return bitmap;
    }
}
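A side note on the real-time part of your question: decoding a JPEG for every frame is relatively slow. A common alternative (this is a sketch under assumptions, not the only way) is to add a second ImageReader with ImageFormat.YUV_420_888 as an extra target of your preview request and build a grayscale Mat directly from the Y plane, skipping the JPEG and Bitmap steps entirely; previewReader and yuvImageToGrayMat below are hypothetical names:
// hypothetical second ImageReader for real-time frames (add its surface to your preview request)
ImageReader previewReader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 2);

// build a grayscale Mat straight from the Y (luminance) plane of a YUV_420_888 Image
public static Mat yuvImageToGrayMat(Image image) {
    Image.Plane yPlane = image.getPlanes()[0]; // the Y plane always has pixelStride == 1
    ByteBuffer yBuffer = yPlane.getBuffer();
    int width = image.getWidth();
    int height = image.getHeight();
    int rowStride = yPlane.getRowStride(); // may be larger than width due to padding
    Mat gray = new Mat(height, width, CvType.CV_8UC1);
    byte[] row = new byte[width];
    for (int r = 0; r < height; r++) {
        yBuffer.position(r * rowStride); // skip any row padding
        yBuffer.get(row, 0, width);
        gray.put(r, 0, row);
    }
    return gray;
}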
P.S. I also recommend reading this article to understand how to connect OpenCV to Android.
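In case it helps, here is a minimal sketch of the usual OpenCV initialization on Android (assuming the OpenCV Android SDK is already added to your project); without it, the Imgproc and Utils calls above will fail with an UnsatisfiedLinkError:
// load the OpenCV native library before using any of its classes,
// e.g. in your activity's onResume()
@Override
protected void onResume() {
    super.onResume();
    if (!OpenCVLoader.initDebug()) {
        // falls back to the OpenCV Manager app if the bundled native libs are missing
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_0_0, this, mLoaderCallback);
    } else {
        mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
    }
}

private final BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        if (status == LoaderCallbackInterface.SUCCESS) {
            Log.i(TAG, "OpenCV loaded successfully");
        } else {
            super.onManagerConnected(status);
        }
    }
};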