Camera2 With Multiple Outputs

Here is my issue: I am currently unable to populate one of three Surfaces with a still image captured from the Camera2 API after invoking a capture. I used the Google Camera2 getting-started guide (need link) to set up the boilerplate, so a good chunk of this code may look familiar.

The Camera2 capture documentation states:

Each request will produce one CaptureResult and produce new frames for one or more target Surfaces, set with the CaptureRequest builder's addTarget(Surface) method. The target surfaces (set with addTarget(Surface)) must be a subset of the surfaces provided when this capture session was created.

Okay Android, I'll abide:

// In order to use a surface, it must be registered when creating the session
List<Surface> surfaces = new ArrayList<>(previewSurfaces); // size + 1
surfaces.addAll(displaySurfaces);                          // size + 1
surfaces.add(mImageReader.getSurface());                   // size + 1
try {
    mCameraDevice.createCaptureSession(surfaces, mCaptureSessionCallback, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}

And then do this

// Add the display surfaces along with the internal image reader
captureBuilder.addTarget(mImageReader.getSurface());
for(Surface surface : this.displaySurfaces) {
     captureBuilder.addTarget(surface);  // there's only one
}
CaptureRequest captureRequest = captureBuilder.build();
mCaptureSession.capture(captureRequest, captureCallback, mBackgroundHandler);

At this point, every surface that was added to the request handed to capture() should receive the camera output. However, only the mImageReader's callback gets invoked, while the displaySurfaces get no love.

So I started poking around with the debugger and noted that the request handed to capture() has an output set of size 2, and the camera session's internal output set is of size 3 (both correct). However, the display surface shows nothing. The other surface displays a preview, and that works fine.
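In case it helps with debugging, here is a sketch of the kind of logging CaptureCallback I mean; this is my illustrative version (not necessarily the captureCallback used above), and it only confirms whether the single request completes for the display target:

// Sketch: a CaptureCallback that logs completion/failure so you can see
// whether the capture request actually finished, independent of whether the
// display surface rendered anything. Field names follow the code above.
CameraCaptureSession.CaptureCallback loggingCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Log.d(LOG_TAG, "Capture completed; display surface targeted: "
                + request.containsTarget(mCapturedTile.getHolder().getSurface()));
    }

    @Override
    public void onCaptureFailed(CameraCaptureSession session,
                                CaptureRequest request,
                                CaptureFailure failure) {
        Log.w(LOG_TAG, "Capture failed, reason: " + failure.getReason());
    }
};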

Things that I have checked: a) yes, the view is visible and its surface is ready; b) the view has a resolution that is supported by the camera, which has been set using setFixedSize().
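
For completeness, here is roughly what I mean by check (b); this is a sketch of my own, assuming characteristics holds the CameraCharacteristics of the camera being opened (it is not shown in the code above):

// Sketch of check (b): confirm the requested size is one of the camera's
// supported SurfaceHolder output sizes before calling setFixedSize().
// previewTile is the small display tile from the code in Edit1 below.
StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
List<Size> supported = Arrays.asList(map.getOutputSizes(SurfaceHolder.class));
Size requested = new Size(176, 144);
if (supported.contains(requested)) {
    previewTile.getHolder().setFixedSize(requested.getWidth(), requested.getHeight());
} else {
    Log.w(LOG_TAG, "Size " + requested + " not supported for SurfaceHolder output");
}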

What's happening here? I feel like I'm missing something. Help!

Edit1

Here is how the surfaces are generated:

private SurfaceView getPreviewView(Context context) throws CameraAccessException {
        // TODO set z overlay?
        Size previewSize = mCameraController.determineLargestSize();
        Log.d(LOG_TAG, "Setting preview dimensions to: \n" +
            "width: " + Integer.toString(previewSize.getWidth()) + "\n" +
            "height " + Integer.toString(previewSize.getHeight()));
        SurfaceView view = new SurfaceView(context);
        view.getHolder().addCallback(mPreviewViewCallback);
        view.getHolder().setFixedSize(previewSize.getWidth(),previewSize.getHeight());
        view.setOnTouchListener(mZoomFocusListener);
        return view;
    }

private SurfaceView getDisplayTile(Context context) throws CameraAccessException {
        SurfaceView previewTile = new SurfaceView(context);
        Size smallest = mCameraController.determineSmallestSize();
        int width = smallest.getWidth();
        int height = smallest.getHeight();
        Log.d(LOG_TAG, "Setting display tile dimensions to: \n" +
                "width: " + Integer.toString(smallest.getWidth()) + "\n" +
                "height " + Integer.toString(smallest.getHeight()));
        previewTile.getHolder().addCallback(mDisplayCallback);
        previewTile.getHolder().setFixedSize(width,height);
        previewTile.setZOrderMediaOverlay(true);
        previewTile.getHolder().setFormat(PixelFormat.TRANSLUCENT);
        previewTile.setBackgroundColor(Color.BLUE);
        RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(width,height);
        params.addRule(RelativeLayout.ALIGN_PARENT_TOP);
        params.addRule(RelativeLayout.ALIGN_PARENT_END);
        params.addRule(RelativeLayout.ALIGN_PARENT_RIGHT);
        params.setMargins(7,7,7,7);
        previewTile.setLayoutParams(params);
        return previewTile;
    }

And here is how I bring it all together:

private void initViews(Context context) throws CameraAccessException {
        mFrameLayout = new FrameLayout(context);
        // Add the view for the camera preview
        FrameLayout.LayoutParams surfaceParams = new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.MATCH_PARENT,
                FrameLayout.LayoutParams.MATCH_PARENT);
        mPreviewView = this.getPreviewView(context);
        mFrameLayout.addView(mPreviewView,surfaceParams);

        // Add the preview tile to the layout
        FrameLayout.LayoutParams displayParams = new FrameLayout.LayoutParams(
                FrameLayout.LayoutParams.WRAP_CONTENT,
                FrameLayout.LayoutParams.WRAP_CONTENT);
        displayParams.gravity = Gravity.BOTTOM | Gravity.RIGHT;
        RelativeLayout layout = new RelativeLayout(context);
        mCapturedTile = this.getDisplayTile(context);
        layout.addView(mCapturedTile);
        mFrameLayout.addView(layout, displayParams);
}

Here's a screenshot of the app with no preview running. I colored the bottom-right box blue; that's the display surface that is not showing the camera output.

I'm also attaching the Logcat output, to give you an idea of what's happening:

12-19 16:00:33.185 29863-29863/ngc.com.camera2app I/CameraController: This device has 2 available cameras
12-19 16:00:33.202 29863-29863/ngc.com.camera2app D/CameraController: Retrieveing current crop region
12-19 16:00:33.206 29863-29863/ngc.com.camera2app D/CameraView: Setting preview dimensions to: 
                                                                width: 3264
                                                                height 2448
12-19 16:00:33.208 29863-29863/ngc.com.camera2app D/CameraView: Setting display tile dimensions to: 
                                                                width: 176
                                                                height 144
12-19 16:00:33.948 29863-29863/ngc.com.camera2app D/CameraView: Surface created
12-19 16:00:33.949 29863-29863/ngc.com.camera2app D/CameraView: preview surface changed dimensions are
12-19 16:00:33.950 29863-29863/ngc.com.camera2app D/CameraView: width : 3264 | height : 2448
12-19 16:00:33.964 29863-29863/ngc.com.camera2app D/CameraView: Surface created
12-19 16:00:33.964 29863-29863/ngc.com.camera2app D/CameraView: display surface changed dimensions are
12-19 16:00:33.964 29863-29863/ngc.com.camera2app D/CameraView: width : 176 | height : 144
12-19 16:00:33.973 29863-29863/ngc.com.camera2app I/CameraController: Attempting to open camera
12-19 16:00:34.048 29863-29929/ngc.com.camera2app I/CameraController: Camera 0 is open for business
12-19 16:00:34.055 29863-29863/ngc.com.camera2app W/PathParser: Points are too far apart 4.000000596046461
12-19 16:00:34.065 29863-29929/ngc.com.camera2app D/CameraView: Camera ready
12-19 16:00:48.343 29863-29929/ngc.com.camera2app I/CameraController: Image available

I'm still stuck and thinking of punting on this overly subtle bug and simply loading the image from the ImageReader callback. However, if the API says it can do this, why doesn't it?
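
For reference, that fallback would look roughly like the sketch below; drawImageToTile is a hypothetical helper of mine (not part of the code above), and it assumes mImageReader was created with ImageFormat.JPEG and that this method is called from its OnImageAvailableListener with mCapturedTile.getHolder():

// Sketch of the fallback: decode the captured JPEG from the ImageReader and
// draw it onto the display tile's surface by hand.
private void drawImageToTile(Image image, SurfaceHolder holder) {
    // JPEG images carry a single plane containing the encoded bytes
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);

    // Draw the decoded frame scaled to fill the small tile
    Canvas canvas = holder.lockCanvas();
    if (canvas != null) {
        Rect dst = new Rect(0, 0, canvas.getWidth(), canvas.getHeight());
        canvas.drawBitmap(bitmap, null, dst, null);
        holder.unlockCanvasAndPost(canvas);
    }
    image.close();
}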

Edit2

This method gets invoked once both of the surface callbacks have gone through onChanged():

private synchronized void openCamera(){
        if(displayReady && previewReady){
            try {
                List<Surface> displaySurfaces = new ArrayList<>();
                //uncommenting this line adds a nasty bug
                displaySurfaces.add(mPreviewView.getHolder().getSurface());

                List<Surface> previewSurfaces = new ArrayList<>();
                previewSurfaces.add(mCapturedTile.getHolder().getSurface());

                mCameraController.openCamera(displaySurfaces, previewSurfaces);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

Edit3

Changed the resolutions to be the same for both displays, and the result is the same. The SystemUI gets very distorted (the notification bar on top gets duplicated across the entire screen, the bottom soft buttons move to the middle of the screen, very weird). I attempted to take a picture of this behavior, yet this is what I get:

[screenshot]

Note that the screenshot above is not what the UI is actually displaying for me. In addition, after the app closes and I restart it, half of the preview screen is green (again, I cannot capture this via a screenshot). Any thoughts? I was starting to think that the Camera2 API has a bug; however, I'm also starting to think that there is some configuration that needs to be set on these surfaces. Not sure why the weird UI behavior happens, though...

1 Answer

Thank you @EddyTalvala, it turned out the issue was due to the resolution mismatch: PreviewSurface: 3264 x 2448, DisplaySurface: 176 x 144.

I met in the middle and changed the resolutions of both surfaces to 640 x 480, and that resolved the issue.
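
If it helps anyone else, picking that shared resolution from the camera's own supported list can look roughly like the sketch below, assuming characteristics is the CameraCharacteristics of the opened camera (the field names otherwise match the question):

// Sketch: query the supported SurfaceHolder output sizes and apply one common
// size (640x480 here) to both SurfaceViews before creating the session.
StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] sizes = map.getOutputSizes(SurfaceHolder.class);

Size common = sizes[0];
for (Size s : sizes) {
    if (s.getWidth() == 640 && s.getHeight() == 480) {
        common = s;
        break;
    }
}
mPreviewView.getHolder().setFixedSize(common.getWidth(), common.getHeight());
mCapturedTile.getHolder().setFixedSize(common.getWidth(), common.getHeight());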
