
Camera2 understanding the sensor and device orientations

I bumped into an issue while trying to implement a touch-to-focus feature using the Android Camera2 API.

The theory is easy:

  • obtain the tap position in the preview surface
  • map it to the dimensions of the sensor, or of the sensor crop area in case of a zoom, making sure to swap the dimensions if needed
  • apply a change of basis to end up in the same basis as the sensor
  • make a MeteringRectangle out of the result and use it in a new CaptureRequest (a rough sketch of this last step follows the list)
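
For illustration, that last step could look something like the sketch below, assuming the earlier steps already produced a point in sensor coordinates; focusAt, mPreviewRequestBuilder and mCaptureSession are hypothetical names, not taken from any sample.

// Hypothetical last step: build a metering region around a point that has
// already been mapped into sensor coordinates, then refocus on it.
private void focusAt(Point sensorPoint) {
    int half = 100; // half the region size in sensor pixels (arbitrary choice)
    MeteringRectangle region = new MeteringRectangle(
            Math.max(sensorPoint.x - half, 0),
            Math.max(sensorPoint.y - half, 0),
            half * 2, half * 2,
            MeteringRectangle.METERING_WEIGHT_MAX);

    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_REGIONS,
            new MeteringRectangle[]{region});
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_AUTO);
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
            CaptureRequest.CONTROL_AF_TRIGGER_START);
    try {
        mCaptureSession.capture(mPreviewRequestBuilder.build(), null, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}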

There are a number of examples out there that show how to deal with the first and last points, but not many that deal with the second and third in an understandable way. The docs and examples are not really clear, and it can all be quite confusing.

Here we go...


The CameraCharacteristics.SENSOR_ORIENTATION is described as:

Clockwise angle through which the output image needs to be rotated to be upright on the device screen in its native orientation.

Knowing that the sensor coordinate system is defined with (0,0) being the top-left pixel in the active pixel array, I read this as the angle needed to rotate an image captured in the sensor coordinate system so that it appears upright in the device's native orientation. So if the sensor's top is facing the right side of a phone whose native orientation is portrait, the SENSOR_ORIENTATION will be 90°.
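
For reference, the value can be queried from the camera characteristics like this (context and cameraId assumed to be at hand, exception handling omitted):

CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
// One of 0, 90, 180 or 270; clockwise, as described above.
int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);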


The display orientation, obtained via mActivity.getWindowManager().getDefaultDisplay().getRotation(), is documented as:

Returns the rotation of the screen from its "natural" orientation. The returned value may be Surface.ROTATION_0 (no rotation), Surface.ROTATION_90, Surface.ROTATION_180, or Surface.ROTATION_270. For example, if a device has a naturally tall screen, and the user has turned it on its side to go into a landscape orientation, the value returned here may be either Surface.ROTATION_90 or Surface.ROTATION_270 depending on the direction it was turned. The angle is the rotation of the drawn graphics on the screen, which is the opposite direction of the physical rotation of the device. For example, if the device is rotated 90 degrees counter-clockwise, to compensate rendering will be rotated by 90 degrees clockwise and thus the returned value here will be Surface.ROTATION_90.

I find this definition much clearer than the sensor orientation one; it leaves no room for interpretation.


Now where things start to get ugly...

I decided to use the method provided in the Camera2Raw example to obtain the rotation from the sensor orientation to the current device orientation.
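
The ORIENTATIONS lookup that this method relies on is, as far as I can tell from the Camera2Raw sample, a plain mapping from the Surface.ROTATION_* constants to degrees:

private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
    ORIENTATIONS.append(Surface.ROTATION_0, 0);
    ORIENTATIONS.append(Surface.ROTATION_90, 90);
    ORIENTATIONS.append(Surface.ROTATION_180, 180);
    ORIENTATIONS.append(Surface.ROTATION_270, 270);
}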

/**
 * Rotation need to transform from the camera sensor orientation to the device's current
 * orientation.
 *
 * @param c                 the {@link CameraCharacteristics} to query for the camera sensor
 *                          orientation.
 * @param deviceOrientation the current device orientation relative to the native device
 *                          orientation.
 * @return the total rotation from the sensor orientation to the current device orientation.
 */
private static int sensorToDeviceRotation(CameraCharacteristics c, int deviceOrientation) {
    int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);

    // Get device orientation in degrees
    deviceOrientation = ORIENTATIONS.get(deviceOrientation);

    // Reverse device orientation for front-facing cameras
    if (c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
        deviceOrientation = -deviceOrientation;
    }

    // Calculate desired JPEG orientation relative to camera orientation to make
    // the image upright relative to the device orientation
    return (sensorOrientation + deviceOrientation + 360) % 360;
}

Here is a table of the different outputs for a back- and a front-facing camera on a phone with a portrait native orientation (both cameras report a 90° sensor orientation):

Device orientation | Back camera | Front camera
-------------------|-------------|-------------
0°                 | 90°         | 90°
90°                | 180°        | 0°
180°               | 270°        | 270°
270°               | 0°          | 180°

The first thing I noticed is that if I take the outputs as described (the rotation from the camera sensor orientation to the device's current orientation), then for them to make sense I have to consider the output rotation to be counter-clockwise (unlike the sensor orientation and device orientation, which are clockwise)! For instance, if we take the typical 90° sensor and 0° device orientation, the result is 90°, and if I'm not mistaken in my analysis, that rotation can only be counter-clockwise.

Under the hypothesis that my understanding of the sensor and device orientations is correct (I am not certain of that), something must be wrong with the results in the table above, because if you look at the 90° sensor and 90° device orientation case, it can't be 180°, it should be 0°. The next picture is a visual representation of my understanding of all of this for a phone with a 90° sensor orientation.

I went ahead and implemented my changes of basis in R² to get my tap point from the screen basis to the sensor basis, and added the expected offsets.
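
To make this concrete, here is a simplified sketch of what such a change of basis could look like; it ignores front-camera mirroring, assumes the preview shows exactly the crop region, and takes the tap position normalized to [0, 1] in screen coordinates, so previewToSensor and its parameters are hypothetical names:

// Hypothetical change of basis: undo the sensor-to-device rotation to map a
// normalized tap position (nx, ny) on the preview into crop-region coordinates.
private Point previewToSensor(float nx, float ny, Rect crop, int rotation) {
    float sx, sy; // normalized coordinates in the sensor basis
    switch (rotation) {
        case 0:   sx = nx;      sy = ny;      break;
        case 90:  sx = ny;      sy = 1f - nx; break;
        case 180: sx = 1f - nx; sy = 1f - ny; break;
        case 270: sx = 1f - ny; sy = nx;      break;
        default:  throw new IllegalArgumentException("rotation: " + rotation);
    }
    return new Point(crop.left + Math.round(sx * crop.width()),
                     crop.top + Math.round(sy * crop.height()));
}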

I observed that if I swap the 180° and 0° calculations, my touch to focus works perfectly. The correct observed values for the rotation from the sensor to the current device orientation actually correspond to the front camera column of the table.

So my gut feeling is that sensorToDeviceRotation is flawed and the return value should be:

// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
return (sensorOrientation - deviceOrientation + 360) % 360;

It would actually be more logical in terms of what is being computed: a rotation from the sensor orientation to the device orientation.

Could someone confirm this? Or have I misunderstood something somewhere?

Cheers

asked Jan 23 '18 by Dude



1 Answer

Yes, this is a bug in Camera2Raw; thank you for catching it.

If you compare this with the sample code in the reference docs for Camera#setDisplayOrientation, you'll see the math you'd expect:

...
if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
    result = (info.orientation + degrees) % 360;
    result = (360 - result) % 360;  // compensate the mirror
} else {  // back-facing
    result = (info.orientation - degrees + 360) % 360;
}
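
Carried over to the question's sensorToDeviceRotation, a corrected version might look like the sketch below; it simply transplants the setDisplayOrientation math, and whether you want the front-camera mirror compensation depends on what you use the rotation for:

// Sketch: sensorToDeviceRotation with the math from setDisplayOrientation.
private static int sensorToDeviceRotation(CameraCharacteristics c, int deviceOrientation) {
    int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);

    // Convert the Surface.ROTATION_* constant to degrees.
    deviceOrientation = ORIENTATIONS.get(deviceOrientation);

    if (c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
        int result = (sensorOrientation + deviceOrientation) % 360;
        return (360 - result) % 360; // compensate for the mirror
    }
    // Back-facing: rotation from the sensor to the current device orientation.
    return (sensorOrientation - deviceOrientation + 360) % 360;
}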
answered Sep 19 '22 by Eddy Talvala