
iOS TrueDepth frame to point cloud

I'm trying to obtain a 3D point cloud from a single TrueDepth frame (AVDepthData / CVPixelBuffer) on iOS. I came across Apple's official sample, Streaming Depth Data from the TrueDepth Camera, but I can't seem to find the last missing piece.

This code renders exactly the point cloud I'm interested in, but I can't figure out how to obtain the world coordinates in meters from it and store them as a PCD file.

Is there a way to read out all of the 3D points of the Metal depth texture, or is this the wrong approach? If so, where would I go from here?

Thanks for your help!

Jaykob asked Dec 05 '25 20:12


1 Answer

In the shader function vertexShaderPoints, take a look at the following:

uint2 pos;
// Map the linear vertex ID back to 2D texture coordinates.
pos.y = vertexID / depthTexture.get_width();
pos.x = vertexID % depthTexture.get_width();

// depthDataType is kCVPixelFormatType_DepthFloat16; the values are in
// meters, scaled to millimeters here for rendering.
float depth = depthTexture.read(pos).x * 1000.0f;

// Pinhole unprojection. With Metal's column-major indexing:
// cameraIntrinsics[0][0] = fx, [1][1] = fy, [2][0] = cx, [2][1] = cy.
float xrw = (pos.x - cameraIntrinsics[2][0]) * depth / cameraIntrinsics[0][0];
float yrw = (pos.y - cameraIntrinsics[2][1]) * depth / cameraIntrinsics[1][1];

float4 xyzw = { xrw, yrw, depth, 1.f };

Refactor this calculation for your point cloud writer and I think you'll have what you're looking for.
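Here is one way that refactor could look on the CPU, as a sketch rather than a drop-in solution. It assumes you have first converted the AVDepthData to kCVPixelFormatType_DepthFloat32 (via depthData.converting(toDepthDataType:)) so the pixels can be read as plain Float32 meters, and that intrinsics is the matrix from AVCameraCalibrationData.intrinsicMatrix, already scaled from intrinsicMatrixReferenceDimensions down to the depth map's resolution. The function name writePointCloud is my own; the PCD header follows the ASCII v0.7 layout.

```swift
import Foundation
import CoreVideo
import simd

/// Unproject a DepthFloat32 pixel buffer into 3D camera-space points (meters)
/// and write them out as an ASCII PCD file.
func writePointCloud(depthBuffer: CVPixelBuffer,
                     intrinsics: matrix_float3x3,
                     to url: URL) throws {
    CVPixelBufferLockBaseAddress(depthBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthBuffer, .readOnly) }

    let width    = CVPixelBufferGetWidth(depthBuffer)
    let height   = CVPixelBufferGetHeight(depthBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthBuffer)
    let base     = CVPixelBufferGetBaseAddress(depthBuffer)!

    // Same pinhole parameters the shader uses (column-major simd matrix).
    let fx = intrinsics.columns.0.x   // focal length x
    let fy = intrinsics.columns.1.y   // focal length y
    let cx = intrinsics.columns.2.x   // principal point x
    let cy = intrinsics.columns.2.y   // principal point y

    var points: [SIMD3<Float>] = []
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes)
                      .assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let depth = row[x]  // meters; no *1000 since we are not rendering
            guard depth.isFinite, depth > 0 else { continue }
            let xrw = (Float(x) - cx) * depth / fx
            let yrw = (Float(y) - cy) * depth / fy
            points.append(SIMD3<Float>(xrw, yrw, depth))
        }
    }

    // Minimal ASCII PCD v0.7 header followed by one "x y z" line per point.
    var pcd = """
    # .PCD v0.7 - Point Cloud Data file format
    VERSION 0.7
    FIELDS x y z
    SIZE 4 4 4
    TYPE F F F
    COUNT 1 1 1
    WIDTH \(points.count)
    HEIGHT 1
    VIEWPOINT 0 0 0 1 0 0 0
    POINTS \(points.count)
    DATA ascii

    """
    for p in points {
        pcd += "\(p.x) \(p.y) \(p.z)\n"
    }
    try pcd.write(to: url, atomically: true, encoding: .utf8)
}
```

Note the scaling of the intrinsics: intrinsicMatrix is expressed relative to intrinsicMatrixReferenceDimensions, so divide fx, fy, cx, cy by the ratio of the reference width to the depth map width before calling this. Unlike the shader, the depth here is kept in meters, which is what you want in the PCD file.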

Legacy-Dev answered Dec 08 '25 13:12