I'm trying to run the Android camera2 API in a background service and then process each frame in the ImageReader.OnImageAvailableListener callback. I already use the suggested raw format YUV_420_888 to get the maximum fps, but I only get around 7 fps at 640x480. This is even slower than what I get with the old Camera interface (I want to upgrade to camera2 to get higher fps), or with the OpenCV JavaCameraView (which I can't use, because I need to run the processing in a background service).
Below is my service class. What am I missing?
My phone is a Redmi Note 3 running Android 5.0.2.
public class Camera2ServiceYUV extends Service {
protected static final String TAG = "VideoProcessing";
protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
protected CameraDevice cameraDevice;
protected CameraCaptureSession captureSession;
protected ImageReader imageReader;
// A semaphore to prevent the app from exiting before closing the camera.
private Semaphore mCameraOpenCloseLock = new Semaphore(1);
public static final String RESULT_RECEIVER = "resultReceiver";
private static final int JPEG_COMPRESSION = 90;
public static final int RESULT_OK = 0;
public static final int RESULT_DEVICE_NO_CAMERA= 1;
public static final int RESULT_GET_CAMERA_FAILED = 2;
public static final int RESULT_ALREADY_RUNNING = 3;
public static final int RESULT_NOT_RUNNING = 4;
private static final String START_SERVICE_COMMAND = "startServiceCommands";
private static final int COMMAND_NONE = -1;
private static final int COMMAND_START = 0;
private static final int COMMAND_STOP = 1;
private boolean mRunning = false;
public Camera2ServiceYUV() {
}
public static void startToStart(Context context, ResultReceiver resultReceiver) {
Intent intent = new Intent(context, Camera2ServiceYUV.class);
intent.putExtra(START_SERVICE_COMMAND, COMMAND_START);
intent.putExtra(RESULT_RECEIVER, resultReceiver);
context.startService(intent);
}
public static void startToStop(Context context, ResultReceiver resultReceiver) {
Intent intent = new Intent(context, Camera2ServiceYUV.class);
intent.putExtra(START_SERVICE_COMMAND, COMMAND_STOP);
intent.putExtra(RESULT_RECEIVER, resultReceiver);
context.startService(intent);
}
// SERVICE INTERFACE
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
switch (intent.getIntExtra(START_SERVICE_COMMAND, COMMAND_NONE)) {
case COMMAND_START:
startCamera(intent);
break;
case COMMAND_STOP:
stopCamera(intent);
break;
default:
throw new UnsupportedOperationException("Cannot start the camera service with an illegal command.");
}
return START_STICKY;
}
@Override
public void onDestroy() {
try {
captureSession.abortCaptures();
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
}
captureSession.close();
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
// CAMERA2 INTERFACE
/**
* 1. The android CameraManager class is used to manage all the camera devices in our android device
* Each camera device has a range of properties and settings that describe the device.
* It can be obtained through the camera characteristics.
*/
public void startCamera(Intent intent) {
final ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);
if (mRunning) {
resultReceiver.send(RESULT_ALREADY_RUNNING, null);
return;
}
mRunning = true;
CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
try {
if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
String pickedCamera = getCamera(manager);
Log.e(TAG,"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA " + pickedCamera);
manager.openCamera(pickedCamera, cameraStateCallback, null);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(pickedCamera);
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.YUV_420_888);
}
int width = 640;
int height = 480;
// if (jpegSizes != null && 0 < jpegSizes.length) {
// width = jpegSizes[jpegSizes.length -1].getWidth();
// height = jpegSizes[jpegSizes.length - 1].getHeight();
// }
// for(Size s : jpegSizes)
// {
// Log.e(TAG,"Size = " + s.toString());
// }
// DEBUG
StreamConfigurationMap map = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map == null) {
return;
}
Log.e(TAG,"Width = " + width + ", Height = " + height);
Log.e(TAG,"output stall duration = " + map.getOutputStallDuration(ImageFormat.YUV_420_888, new Size(width,height)) );
Log.e(TAG,"Min output stall duration = " + map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, new Size(width,height)) );
// Size[] sizeList = map.getInputSizes(ImageFormat.YUV_420_888);
// for(Size s : sizeList)
// {
// Log.e(TAG,"Size = " + s.toString());
// }
imageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2 /* images buffered */);
imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
Log.i(TAG, "imageReader created");
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
resultReceiver.send(RESULT_DEVICE_NO_CAMERA, null);
}catch (InterruptedException e) {
resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
}
catch(SecurityException se)
{
resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
throw new RuntimeException("Security permission exception while trying to open the camera.", se);
}
resultReceiver.send(RESULT_OK, null);
}
// We can pick the camera being used, i.e. rear camera in this case.
private String getCamera(CameraManager manager) {
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
if (cOrientation == CAMERACHOICE) {
return cameraId;
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
return null;
}
/**
* 1.1 Callbacks when the camera changes its state - opened, disconnected, or error.
*/
protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.i(TAG, "CameraDevice.StateCallback onOpened");
mCameraOpenCloseLock.release();
cameraDevice = camera;
createCaptureSession();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
mCameraOpenCloseLock.release();
camera.close();
cameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Log.e(TAG, "CameraDevice.StateCallback onError " + error);
mCameraOpenCloseLock.release();
camera.close();
cameraDevice = null;
}
};
/**
* 2. To capture or stream images from a camera device, the application must first create
* a camera capture captureSession.
* The camera capture needs a surface to output what has been captured, in this case
* we use ImageReader in order to access the frame data.
*/
public void createCaptureSession() {
try {
cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
}
}
protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
Log.i(TAG, "CameraCaptureSession.StateCallback onConfigured");
// The camera is already closed
if (null == cameraDevice) {
return;
}
// When the captureSession is ready, we start to grab the frame.
Camera2ServiceYUV.this.captureSession = session;
try {
session.setRepeatingRequest(createCaptureRequest(), null, null);
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Log.e(TAG, "CameraCaptureSession.StateCallback onConfigureFailed");
}
};
/**
* 3. The application then needs to construct a CaptureRequest, which defines all the capture parameters
* needed by a camera device to capture a single image.
*/
private CaptureRequest createCaptureRequest() {
try {
/**
* Check other templates for further details.
* TEMPLATE_MANUAL = 6
* TEMPLATE_PREVIEW = 1
* TEMPLATE_RECORD = 3
* TEMPLATE_STILL_CAPTURE = 2
* TEMPLATE_VIDEO_SNAPSHOT = 4
* TEMPLATE_ZERO_SHUTTER_LAG = 5
*
* TODO: can set camera features like auto focus, auto flash here
* captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
*/
CaptureRequest.Builder captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
// captureRequestBuilder.set(CaptureRequest.EDGE_MODE,
// CaptureRequest.EDGE_MODE_OFF);
// captureRequestBuilder.set(
// CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
// CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
// captureRequestBuilder.set(
// CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
// CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
// captureRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE,
// CaptureRequest.NOISE_REDUCTION_MODE_OFF);
// captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
// CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
//
// captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
// captureRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);
captureRequestBuilder.addTarget(imageReader.getSurface());
return captureRequestBuilder.build();
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
return null;
}
}
/**
* ImageReader provides a surface for the camera to output what has been captured.
* Upon the image available, call processImage() to process the image as desired.
*/
private long frameTime = 0;
private ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.i(TAG, "called ImageReader.OnImageAvailable");
Image img = reader.acquireLatestImage();
if (img != null) {
if( frameTime != 0 )
{
Log.e(TAG, "fps = " + (float)(1000.0 / (float)(SystemClock.elapsedRealtime() - frameTime)) + " fps");
}
frameTime = SystemClock.elapsedRealtime();
img.close();
}
}
};
private void processImage(Image image) {
Mat outputImage = imageToMat(image);
Bitmap bmp = Bitmap.createBitmap(outputImage.cols(), outputImage.rows(), Bitmap.Config.ARGB_8888);
Utils.bitmapToMat(bmp, outputImage);
Point mid = new Point(0, 0);
Point inEnd = new Point(outputImage.cols(), outputImage.rows());
Imgproc.line(outputImage, mid, inEnd, new Scalar(255, 0, 0), 2, Core.LINE_AA, 0);
Utils.matToBitmap(outputImage, bmp);
Intent broadcast = new Intent();
broadcast.setAction("your_load_photo_action");
broadcast.putExtra("BitmapImage", bmp);
sendBroadcast(broadcast);
}
private Mat imageToMat(Image image) {
ByteBuffer buffer;
int rowStride;
int pixelStride;
int width = image.getWidth();
int height = image.getHeight();
int offset = 0;
Image.Plane[] planes = image.getPlanes();
byte[] data = new byte[image.getWidth() * image.getHeight() * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
byte[] rowData = new byte[planes[0].getRowStride()];
for (int i = 0; i < planes.length; i++) {
buffer = planes[i].getBuffer();
rowStride = planes[i].getRowStride();
pixelStride = planes[i].getPixelStride();
int w = (i == 0) ? width : width / 2;
int h = (i == 0) ? height : height / 2;
for (int row = 0; row < h; row++) {
int bytesPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
if (pixelStride == bytesPerPixel) {
int length = w * bytesPerPixel;
buffer.get(data, offset, length);
// Advance buffer the remainder of the row stride, unless on the last row.
// Otherwise, this will throw an IllegalArgumentException because the buffer
// doesn't include the last padding.
if (h - row != 1) {
buffer.position(buffer.position() + rowStride - length);
}
offset += length;
} else {
// On the last row only read the width of the image minus the pixel stride
// plus one. Otherwise, this will throw a BufferUnderflowException because the
// buffer doesn't include the last padding.
if (h - row == 1) {
buffer.get(rowData, 0, width - pixelStride + 1);
} else {
buffer.get(rowData, 0, rowStride);
}
for (int col = 0; col < w; col++) {
data[offset++] = rowData[col * pixelStride];
}
}
}
}
// Finally, create the Mat.
Mat mat = new Mat(height + height / 2, width, CV_8UC1);
mat.put(0, 0, data);
return mat;
}
private void stopCamera(Intent intent) {
ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);
if (!mRunning) {
resultReceiver.send(RESULT_NOT_RUNNING, null);
return;
}
closeCamera();
resultReceiver.send(RESULT_OK, null);
mRunning = false;
Log.d(TAG, "Service is finished.");
}
/**
* Closes the current {@link CameraDevice}.
*/
private void closeCamera() {
try {
mCameraOpenCloseLock.acquire();
if (null != captureSession) {
captureSession.close();
captureSession = null;
}
if (null != cameraDevice) {
cameraDevice.close();
cameraDevice = null;
}
if (null != imageReader) {
imageReader.close();
imageReader = null;
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
mCameraOpenCloseLock.release();
}
}
}
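For reference, here is a minimal sketch of how I invoke this service from an Activity. The anonymous ResultReceiver and the calling context are just illustrative placeholders, not part of the class above:
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.os.ResultReceiver;
import android.util.Log;

// Receives the RESULT_* codes the service sends back; delivered on the main-thread Handler.
ResultReceiver receiver = new ResultReceiver(new Handler(Looper.getMainLooper())) {
    @Override
    protected void onReceiveResult(int resultCode, Bundle resultData) {
        Log.d("VideoProcessing", "camera service result: " + resultCode);
    }
};

// Somewhere in an Activity or other component:
Camera2ServiceYUV.startToStart(this, receiver);   // begin capturing frames
// ... later ...
Camera2ServiceYUV.startToStop(this, receiver);    // shut the camera down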
I bumped into this problem recently when I tried to upgrade my AR app from the camera1 to the camera2 API. I used a mid-range device for testing (Meizu S6), which has an Exynos 7872 CPU and a Mali-G71 GPU. What I want to achieve is a steady 30 fps AR experience.
But through the migration I found that it's quite tricky to get a decent preview frame rate with the Camera2 API.
I configured my capture request using TEMPLATE_PREVIEW:
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
Then I put two surfaces: one for the preview, a SurfaceTexture at 1280x720, and another, an ImageReader at 1280x720, for image processing.
mImageReader = ImageReader.newInstance(
mVideoSize.getWidth(),
mVideoSize.getHeight(),
ImageFormat.YUV_420_888,
2);
List<Surface> surfaces =new ArrayList<>();
Surface previewSurface = new Surface(mSurfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);
Surface frameCaptureSurface = mImageReader.getSurface();
surfaces.add(frameCaptureSurface);
mPreviewBuilder.addTarget(frameCaptureSurface);
mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CameraMetadata.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), captureCallback, mBackgroundHandler);
Everything works as expected: my TextureView gets updated and the frame callback gets called too. Except... the frame rate is about 10 fps, and I haven't even done any image processing yet.
I have experimented with many Camera2 API settings, including SENSOR_FRAME_DURATION and different ImageFormat and size combinations, but none of them improved the frame rate. However, if I just remove the ImageReader from the output surfaces, the preview easily reaches 30 fps (see the comparison sketch below)!
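To make the comparison concrete, here is a sketch of the two alternative session configurations I tried, using the same variables as the snippets above; sessionStateCallback stands in for whatever CameraCaptureSession.StateCallback the app uses, and the two calls are alternatives, not meant to run back to back:
import java.util.Arrays;

// Both surfaces attached: on my device the preview drops to ~10 fps.
mCameraDevice.createCaptureSession(
        Arrays.asList(previewSurface, mImageReader.getSurface()),
        sessionStateCallback, mBackgroundHandler);

// Preview surface only (ImageReader removed): the same preview runs at 30 fps.
mCameraDevice.createCaptureSession(
        Arrays.asList(previewSurface),
        sessionStateCallback, mBackgroundHandler);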
So I guess the problem is that adding an ImageReader as a Camera2 output surface drastically decreases the preview frame rate, at least in my case. So what is the solution?
My solution is glReadPixels.
I know glReadPixels is one of those evil calls, because it copies bytes from GPU memory back to main memory and also forces OpenGL to flush pending draw commands, so for performance's sake we'd better avoid it. But surprisingly, glReadPixels is actually pretty fast and gives a much better frame rate than the ImageReader's YUV_420_888 output.
In addition, to reduce the memory overhead, I make another draw call into a smaller framebuffer, e.g. 360x640 instead of the preview's 720p, dedicated to feature detection.
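A minimal sketch of that readback path, under my assumptions (the class name FrameReadback, the FBO_WIDTH/FBO_HEIGHT constants and drawCameraQuad() are illustrative placeholders, not from my actual app): render the camera texture into a small offscreen framebuffer, then copy it back with glReadPixels.
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FrameReadback {
    // Downscaled analysis size, as described above (360x640 instead of the 720p preview).
    private static final int FBO_WIDTH = 360;
    private static final int FBO_HEIGHT = 640;

    private int mFboId;
    private int mFboTexture;
    private final ByteBuffer mPixelBuffer =
            ByteBuffer.allocateDirect(FBO_WIDTH * FBO_HEIGHT * 4).order(ByteOrder.nativeOrder());

    // One-time setup on the GL thread: an offscreen framebuffer backed by an RGBA texture.
    public void setup() {
        int[] ids = new int[1];

        GLES20.glGenTextures(1, ids, 0);
        mFboTexture = ids[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mFboTexture);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, FBO_WIDTH, FBO_HEIGHT,
                0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        GLES20.glGenFramebuffers(1, ids, 0);
        mFboId = ids[0];
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFboId);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, mFboTexture, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }

    // Per frame, after the camera SurfaceTexture is updated: draw into the small FBO,
    // then read the pixels back for feature detection.
    public ByteBuffer readBackFrame() {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFboId);
        GLES20.glViewport(0, 0, FBO_WIDTH, FBO_HEIGHT);
        drawCameraQuad();  // placeholder for the app's normal full-screen quad draw call

        mPixelBuffer.rewind();
        GLES20.glReadPixels(0, 0, FBO_WIDTH, FBO_HEIGHT,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mPixelBuffer);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        return mPixelBuffer;  // 360x640 RGBA copy of the current frame
    }

    private void drawCameraQuad() {
        // The app's existing draw of the camera texture goes here (shader setup omitted).
    }
}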
This is based on the camera2 implementation in the OpenCV library. I had the same problem; then I noticed this piece of code in OpenCV's JavaCamera2View. You need to change the settings of the CaptureRequest.Builder this way:
CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
It changed the frame rate from 10 fps to around 28-30 fps for me. It worked with two target surfaces: one for the preview TextureView, the other for the ImageReader:
Surface readerSurface = imageReader.getSurface();
Surface surface = new Surface(surfaceTexture);
captureBuilder.addTarget(surface);
captureBuilder.addTarget(readerSurface);
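Put together, here is a minimal sketch of the repeating request with those two settings and both targets. The session and handler names (mPreviewSession, mBackgroundHandler) are borrowed from the earlier snippets in this thread as an assumption about the surrounding setup:
CaptureRequest.Builder captureBuilder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

// The two settings from JavaCamera2View that made the difference for me.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE,
        CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);

// Both outputs: the preview TextureView's surface and the ImageReader's surface.
Surface surface = new Surface(surfaceTexture);
Surface readerSurface = imageReader.getSurface();
captureBuilder.addTarget(surface);
captureBuilder.addTarget(readerSurface);

mPreviewSession.setRepeatingRequest(captureBuilder.build(), null, mBackgroundHandler);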