
Save depth images from TrueDepth camera

I am trying to save depth images from the iPhone X TrueDepth camera. Using the AVCamPhotoFilter sample code, I can view the depth data, converted to grayscale, on the phone's screen in real time. However, I cannot figure out how to save the sequence of depth images in their raw (16-bit or higher) format.

I have depthData, an instance of AVDepthData. One of its properties is depthDataMap, a CVPixelBuffer with pixel format type kCVPixelFormatType_DisparityFloat16. Is there a way to save it on the phone so I can transfer it off the device for offline manipulation?
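For reference, here is a minimal sketch (an assumption, not part of the original question) of the streaming path AVCamPhotoFilter uses: an AVCaptureDepthDataOutput delivers one AVDepthData per frame, and its depthDataMap is the CVPixelBuffer described above.

```swift
import AVFoundation

// Sketch only: assumes a capture session with an AVCaptureDepthDataOutput
// already configured, as in the AVCamPhotoFilter sample.
class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each frame's depth map is a CVPixelBuffer; on the TrueDepth camera
        // it arrives as kCVPixelFormatType_DisparityFloat16 by default.
        let depthMap = depthData.depthDataMap
        let format = CVPixelBufferGetPixelFormatType(depthMap)
        print("depth frame:", CVPixelBufferGetWidth(depthMap), "x",
              CVPixelBufferGetHeight(depthMap), "format:", format)
    }
}
```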

asked Dec 05 '17 by dustymax


People also ask

What can you do with TrueDepth camera?

The TrueDepth camera captures accurate face data by projecting and analysing thousands of invisible dots to create a depth map of your face. It also captures an infrared image of your face.

How accurate is TrueDepth camera?

According to investigations of the iPhone X, the TrueDepth sensor is suitable for distance measurements: the errors are in the millimeter range and are at most 5% of the target distance.

How does the TrueDepth camera work?

The TrueDepth camera produces disparity maps by default so that the resulting depth data is similar to that produced by a dual camera device. However, unlike a dual camera device, the TrueDepth camera can directly measure depth (in meters) with AVDepthData.Accuracy.absolute accuracy.

How to fix TrueDepth camera issues?

It has been reported that setting up Face ID while in Airplane Mode helped users fix TrueDepth camera issues. Here's how to do it. 1. Navigate to Settings and select Face ID & Passcode. 2. Tap Face ID, then tap "Delete Face".

Can the TrueDepth camera measure depth instead of disparity?

However, unlike a dual camera device, the TrueDepth camera can directly measure depth (in meters) with AVDepthData.Accuracy.absolute accuracy. To capture depth instead of disparity, set the activeDepthDataFormat of the capture device before starting your capture session:
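As an illustration of that configuration step, here is a hedged sketch adapted from Apple's depth-capture documentation; the force-unwrap and error handling are simplified for brevity:

```swift
import AVFoundation

// Sketch: select a depth (not disparity) format on the TrueDepth device
// before starting the capture session. Session setup is omitted.
let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                     for: .video, position: .front)!

// Among the formats the active video format supports, keep only those
// that deliver kCVPixelFormatType_DepthFloat16 rather than disparity.
let depthFormats = device.activeFormat.supportedDepthDataFormats.filter {
    CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_DepthFloat16
}

// Pick the highest-resolution depth format and make it active.
if let best = depthFormats.max(by: {
    CMVideoFormatDescriptionGetDimensions($0.formatDescription).width <
    CMVideoFormatDescriptionGetDimensions($1.formatDescription).width
}) {
    do {
        try device.lockForConfiguration()
        device.activeDepthDataFormat = best
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```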

What is a depth map on iOS devices?

On iOS devices with a back-facing dual camera or a front-facing TrueDepth camera, the capture system can record depth information. A depth map is like an image; however, instead of each pixel providing a color, it indicates distance from the camera to that part of the image (either in absolute terms, or relative to other pixels in the depth map).


1 Answer

There's no standard video format for "raw" depth/disparity maps, which might have something to do with AVCapture not really offering a way to record them.

You have a couple of options worth investigating here:

  1. Convert depth maps to grayscale textures (which you can do using the code in the AVCamPhotoFilter sample code), then pass those textures to AVAssetWriter to produce a grayscale video (see the AVAssetWriter sketch after this list). Depending on the video format and grayscale conversion method you choose, other software you write for reading the video might be able to recover depth/disparity info with sufficient precision for your purposes from the grayscale frames.

  2. Anytime you have a CVPixelBuffer, you can get at the data yourself and do whatever you want with it (see the buffer-copy sketch after this list). Use CVPixelBufferLockBaseAddress (with the readOnly flag) to make sure the content won't change while you read it, then copy the data from the pointer CVPixelBufferGetBaseAddress provides to wherever you want. (Use other pixel buffer functions to see how many bytes to copy, and unlock the buffer when you're done.)

    Watch out, though: if you spend too much time copying from buffers, or otherwise retain them, they won't get deallocated as new buffers come in from the capture system, and your capture session will hang. (All told, it's unclear without testing whether a device has the memory & I/O bandwidth for much recording this way.)
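To make option 1 concrete, here is a hedged sketch of the AVAssetWriter side only; the grayscale conversion (done as in AVCamPhotoFilter) is assumed to produce BGRA pixel buffers, and outputURL, width, and height are placeholders. Keep in mind that a lossy codec such as H.264 limits how much depth precision you can recover later.

```swift
import AVFoundation

// Sketch: set up an AVAssetWriter that accepts converted (grayscale) pixel
// buffers and writes them to a movie file. Appending frames and finishing
// the session are not shown.
func makeDepthVideoWriter(outputURL: URL, width: Int, height: Int) throws
    -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,   // lossy; precision trade-off
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ])
    input.expectsMediaDataInRealTime = true
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
    writer.add(input)
    return (writer, adaptor)
}
```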
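And for option 2, a minimal sketch of copying a depth/disparity CVPixelBuffer's raw bytes into a Data value; the function name and the single-plane assumption are illustrative, not from the answer above.

```swift
import CoreVideo
import Foundation

// Sketch: copy the raw contents of a single-plane depth/disparity
// CVPixelBuffer (e.g. kCVPixelFormatType_DisparityFloat16) into Data that
// can be appended to a file for offline processing.
func rawData(from depthMap: CVPixelBuffer) -> Data? {
    // Lock read-only so the contents can't change while we copy.
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    // bytesPerRow may include row padding; copying height * bytesPerRow keeps
    // the layout intact, and the reader can strip the padding later.
    let byteCount = CVPixelBufferGetBytesPerRow(depthMap) * CVPixelBufferGetHeight(depthMap)
    return Data(bytes: base, count: byteCount)
}
```

Do any file I/O off the capture queue and copy the Data out quickly, so the pixel buffers are released promptly (per the caveat above).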

answered Oct 28 '22 by rickster