In the original (now deprecated) Camera API, we could receive preview frames in Camera.PreviewCallback, process each frame on a background thread for as long as needed, and then return the buffer so another frame could be delivered, all without lagging the on-screen preview. The code looked something like this:
public void onPreviewFrame(final byte[] data, Camera camera) {
    new AsyncTask<Void, Void, Void>() {
        @Override
        protected Void doInBackground(Void... params) {
            // (... do some slow processing ...)
            return null;
        }

        @Override
        protected void onPostExecute(Void aVoid) {
            // Return the buffer so another frame can be delivered.
            mCamera.addCallbackBuffer(data);
        }
    }.execute();
}
The API would only deliver a new frame when a buffer was available to receive it, so slow processing never stalled the screen preview.
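For context, that buffer-driven behaviour comes from the setPreviewCallbackWithBuffer() variant of the old API. A minimal sketch of the setup (assuming mCamera is already open, the default NV21 preview format, and mPreviewCallback is the callback shown above):

// Sketch only: buffer-backed preview callbacks with the deprecated Camera API.
Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
int bufferSize = previewSize.width * previewSize.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
mCamera.addCallbackBuffer(new byte[bufferSize]);     // hand the API one (or more) buffers
mCamera.setPreviewCallbackWithBuffer(mPreviewCallback); // onPreviewFrame only fires
                                                        // while a buffer is available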
I'm trying to replicate the same behaviour on the new Camera2 API, but I can't find a way to do it without lagging the screen preview. If I add a second target (same resolution as the screen one, YUV_420_888) to the preview request:
mPreviewRequestBuilder.addTarget(surface);
mPreviewRequestBuilder.addTarget(previewImageReader.getSurface());
mCameraDevice.createCaptureSession(
        Arrays.asList(surface, previewImageReader.getSurface()), ...
the screen preview will lag, even if I just close the image as soon as I get it:
@Override
public void onImageAvailable(ImageReader reader) {
    reader.acquireNextImage().close();
}
What's the correct way to use Camera2 to emulate the original Camera API behaviour (i.e. receiving a new frame whenever a buffer is free, without slowing the screen preview)?
Update: In case anyone is wondering what the rest of the code looks like, it is just a modified version of the standard android-camera2Basic sample; here's what I've changed.
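(For reference, the kind of change involved is roughly the sketch below, not the actual diff; the preview size, the maxImages value, and mBackgroundHandler are assumptions taken from the Camera2Basic sample.)

// Sketch only: an extra YUV ImageReader added alongside the preview surface.
previewImageReader = ImageReader.newInstance(
        previewSize.getWidth(), previewSize.getHeight(),
        ImageFormat.YUV_420_888, /* maxImages */ 2);
previewImageReader.setOnImageAvailableListener(mOnImageAvailableListener,
        mBackgroundHandler);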
If anyone is still interested: create a SurfaceTextureListener and call your async function from its onSurfaceTextureUpdated method. I have used this successfully to check frames for barcodes with the BarcodeDetection API and the Camera2 API.
Here is a sample of an async task launched from the onSurfaceTextureUpdated method. If you only want to run one async task in the background at a time, you can use a flag to check whether the previous task has completed.
private final TextureView.SurfaceTextureListener mSurfaceTextureListener
        = new TextureView.SurfaceTextureListener() {

    // Set while a frame is being processed so new frames are skipped until done.
    boolean processing;

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        openCamera(width, height);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
        configureTransform(width, height);
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture texture) {
        if (processing) {
            return; // drop this frame, the previous one is still being processed
        }
        processing = true;
        Bitmap photo = mTextureView.getBitmap();
        new ImageTask(photo, new ImageResponse() {
            @Override
            public void processFinished() {
                processing = false;
            }
        }).execute();
    }
};

private interface ImageResponse {
    void processFinished();
}

private class ImageTask extends AsyncTask<Void, Void, Exception> {

    private Bitmap photo;
    private ImageResponse imageResponse;

    ImageTask(Bitmap photo, ImageResponse imageResponse) {
        this.photo = photo;
        this.imageResponse = imageResponse;
    }

    @Override
    protected Exception doInBackground(Void... params) {
        // do background work here (e.g. analyse the bitmap)
        return null;
    }

    @Override
    protected void onPostExecute(Exception result) {
        // Notify on the main thread so the 'processing' flag is only
        // touched from the UI thread.
        imageResponse.processFinished();
    }
}
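As a concrete example of the background work, the barcode check mentioned above might look roughly like this with the Play Services vision API; the detector construction, the QR_CODE-only filter, and the use of getActivity()/TAG are assumptions, not part of the original answer:

// Sketch only: scanning the captured bitmap for barcodes inside doInBackground().
BarcodeDetector detector = new BarcodeDetector.Builder(getActivity())
        .setBarcodeFormats(Barcode.QR_CODE)
        .build();
if (detector.isOperational()) {
    Frame frame = new Frame.Builder().setBitmap(photo).build();
    SparseArray<Barcode> barcodes = detector.detect(frame);
    for (int i = 0; i < barcodes.size(); i++) {
        Log.d(TAG, "Found barcode: " + barcodes.valueAt(i).rawValue);
    }
}
detector.release();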