
Raw depth map SDK for iPhone X

I did some searching and found various examples and documentation on iPhone X Face ID and how it can be used for things like authentication and animated emoji.

I wanted to check whether there is an API/SDK to get the raw depth map from the iPhone X sensor into an app.

From my understanding, the depth calculation is based on the projected IR dot pattern, which could be used to get a depth profile of any object in front of the sensor (possibly depending on the object's texture).

Asked Sep 25 '17 by Anil Maddala



1 Answer

You'll need at least the iOS 11.1 SDK in Xcode 9.1 (both in beta as of this writing). With that, builtInTrueDepthCamera becomes one of the camera types you use to select a capture device:

let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
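
Note that this call returns an optional: it is nil on hardware without a TrueDepth camera (and in the Simulator). A minimal sketch of guarding the result before building a session around it:

import AVFoundation

// The TrueDepth device is only present on supporting hardware, so the
// lookup returns an Optional; bail out early if it's missing.
guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front) else {
    fatalError("No TrueDepth camera on this device")
}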

Then you can go on to set up an AVCaptureSession with the TrueDepth camera device, and can use that capture session to capture depth information much like you can with the back dual camera on iPhone 7 Plus and 8 Plus:

  • Turn on depth capture for photos with AVCapturePhotoOutput.isDepthDataDeliveryEnabled, then snap a picture with AVCapturePhotoSettings.isDepthDataDeliveryEnabled. You can read the depthData from the AVCapturePhoto object you receive after the capture, or turn on embedsDepthDataInPhoto if you just want to fire and forget (and read the data from the captured image file later). A sketch of this photo route follows this list.

  • Get a live feed of depth maps with AVCaptureDepthDataOutput. This one is like the video data output: instead of recording directly to a movie file, it gives your delegate a timed sequence of image (or in this case, depth) buffers. If you're also capturing video at the same time, AVCaptureDataOutputSynchronizer can help you get coordinated depth maps and color frames together. A streaming sketch follows the photo sketch below.
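
To make the photo route concrete, here is a hedged sketch of one possible arrangement. The session wiring and the photoDelegate object (something of yours conforming to AVCapturePhotoCaptureDelegate) are assumptions about app structure, not requirements of the API:

import AVFoundation

// A sketch of one-shot photo + depth capture, assuming `device` from the
// guard above and a granted camera permission.
let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()

guard let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input), session.canAddOutput(photoOutput) else {
    fatalError("Could not configure the TrueDepth capture session")
}
session.addInput(input)
session.addOutput(photoOutput)

// Depth delivery must be enabled on the output before a shot requests it.
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
session.startRunning()

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = true
settings.embedsDepthDataInPhoto = true  // also write depth into the photo file
photoOutput.capturePhoto(with: settings, delegate: photoDelegate)  // photoDelegate: your AVCapturePhotoCaptureDelegate

// In that delegate, the depth map arrives on the AVCapturePhoto:
// func photoOutput(_ output: AVCapturePhotoOutput,
//                  didFinishProcessingPhoto photo: AVCapturePhoto,
//                  error: Error?) {
//     let depth: AVDepthData? = photo.depthData
// }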
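
And a sketch of the streaming route; it assumes the configured session from the photo sketch, and DepthReceiver is a hypothetical class name:

import AVFoundation

// A hypothetical delegate that receives one depth buffer per frame.
final class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // depthData.depthDataMap is a CVPixelBuffer of per-pixel depth
        // (or disparity) values you can process or convert.
    }
}

let depthOutput = AVCaptureDepthDataOutput()
depthOutput.isFilteringEnabled = true  // interpolate over holes in the map
if session.canAddOutput(depthOutput) {
    session.addOutput(depthOutput)
}

let receiver = DepthReceiver()
depthOutput.setDelegate(receiver, callbackQueue: DispatchQueue(label: "depth"))

Setting isFilteringEnabled to true smooths over holes in each map; set it to false if you want the raw, unfiltered values the question asks about.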

As Apple's Device Compatibility documentation notes, you need to select the builtInTrueDepthCamera device to get any of these depth capture options. If you instead select the front-facing builtInWideAngleCamera, it behaves like any other selfie camera, capturing only photos and video.


Just to emphasize: from an API point of view, capturing depth with the front-facing TrueDepth camera on iPhone X is a lot like capturing depth with the back-facing dual cameras on iPhone 7 Plus and 8 Plus. So if you want a deep dive on how all this depth capture business works in general, and what you can do with captured depth information, check out the WWDC17 Session 507: Capturing Depth in iPhone Photography talk.

Answered Oct 19 '22 by rickster