 

Kinect v2: Spatial resolution/ depth resolution / camera calibration

For my application, I analyzed the spatial resolution of the Kinect v2.

To analyze the spatial resolution, I recorded a planar surface, perpendicular to the camera, at a given distance and converted its depth map to a point cloud. Then I compare each point to its neighbors by calculating the Euclidean distance between them.

Calculating the Euclidean distance for this case (1 meter between plane and Kinect), the resolution is close to 3 mm between neighboring points. For a plane at 2 meters distance, I also got a resolution of up to 3 mm.
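For reference, the neighbor-spacing measurement itself is straightforward; here is a minimal sketch against the SDK's C# types (my actual pipeline is in MATLAB; `cloud`, `width`, and `height` are assumed inputs, with `cloud` holding one CameraSpacePoint per depth pixel in row-major order):

double sum = 0;
long count = 0;
for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width - 1; x++)
    {
        CameraSpacePoint a = cloud[y * width + x];
        CameraSpacePoint b = cloud[y * width + x + 1];
        // the SDK maps invalid depth pixels to infinity; skip them
        if (float.IsInfinity(a.Z) || float.IsInfinity(b.Z)) continue;
        double dx = a.X - b.X, dy = a.Y - b.Y, dz = a.Z - b.Z;
        sum += Math.Sqrt(dx * dx + dy * dy + dz * dz);  // Euclidean distance
        count++;
    }
}
double meanSpacing = count > 0 ? sum / count : double.NaN;  // in meters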

Compared to the literature, I think my results are quite bad.

For example, Yang et al. report a mean resolution of 4 mm for a plane at 4 meters distance from the Kinect (Evaluating and Improving the Depth Accuracy of Kinect for Windows v2).

Here is an example of my point cloud of the planar surface (2 meters distance to my Kinect):

[Image: point cloud of the plane at 2 meters from the Kinect v2]

Has anyone made observations regarding the spatial resolution of the Kinect v2, or does anyone have an idea why my resolution is that bad?

In my opinion, something went wrong when converting my depth image to world coordinates. Therefore, here is a code snippet:

%normalize image points by multiplying with the inverse of the intrinsic matrix K
% u,v are the uv-coordinates of my depth image
u_n = (u(:) - c_x) / f_x;
v_n = (v(:) - c_y) / f_y;

%calc radial distortion factor
r = sqrt(u_n.^2 + v_n.^2);
radial_distortion = 1.0 + radial2nd * r.^2 + radial4nd * r.^4 + radial6nd * r.^6;

%apply radial distortion to the normalized coordinates
u_dis = u_n(:) .* radial_distortion;
v_dis = v_n(:) .* radial_distortion;

%apply camera matrix to get back to (distorted) pixel coordinates
x_depth = u_dis * f_x + c_x;
y_depth = v_dis * f_y + c_y;

%back-project 2D to 3D (note that X and Y simplify to u_dis.*d and v_dis.*d)
X = ((x_depth(:) - c_x) .* d(:)) ./ f_x;
Y = ((y_depth(:) - c_y) .* d(:)) ./ f_y;
Z = d;  % d is the given depth value at (u,v)

EDIT: So far I have also tried taking the points directly from the coordinate mapper, without further calibration steps.

The results regarding the resolution are still the same. Does anyone have comparable results?

asked Apr 16 '16 by JavaNullPointer

1 Answer

@JavaNullPointer, the way you are converting your information to 3D with the Kinect v2 is still not well accepted by the community.

Also, the calculations you are making pretty much follow the work of Nicolas Burrus: http://burrus.name/index.php/Research/KinectCalibration

For the Kinect v2, there is still not much information on how to do this either. Nevertheless, the new SDK allows you to save the Kinect's depth-to-camera-space calibration table.

The procedure is quite simple:

1- You need to save this table information (see the sketch just after this list): https://msdn.microsoft.com/en-us/library/windowspreview.kinect.coordinatemapper.getdepthframetocameraspacetable.aspx

2- Once you have saved this information to a file, you can convert your depth points (2D) into 3D camera space.
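For step 1, here is a minimal sketch of grabbing and saving the table with the .NET SDK. The BinaryWriter serialization and the file name are my own choices, not prescribed by the SDK, and in practice the table may only be valid once the CoordinateMapper.CoordinateMappingChanged event has fired:

using System.IO;
using Microsoft.Kinect;

KinectSensor sensor = KinectSensor.GetDefault();
sensor.Open();

// one PointF per depth pixel (512 x 424), row-major
PointF[] table = sensor.CoordinateMapper.GetDepthFrameToCameraSpaceTable();

// persist the table so the 2D-to-3D conversion can be done offline
using (var writer = new BinaryWriter(File.Open("depthToCameraSpaceTable.bin", FileMode.Create)))
{
    foreach (PointF p in table)
    {
        writer.Write(p.X);
        writer.Write(p.Y);
    }
}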

For step 2, here is the code you should use for each depth pixel:

// Get the raw depth (in millimeters) for this pixel
// (row-major indexing, so the stride is the frame Width, not Height)
ushort depth = frameData[y * depthFrameDescription.Width + x];

// Get the value from the x/y lookup table
PointF lutValue = this.cameraSpaceTable[y * depthFrameDescription.Width + x];

// create the CameraSpacePoint for this pixel
// raw depth is in millimeters, camera space is in meters, so convert
CameraSpacePoint csp = new CameraSpacePoint();
csp.X = lutValue.X * depth * 0.001f;
csp.Y = lutValue.Y * depth * 0.001f;
csp.Z = depth * 0.001f;
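As far as I understand the table, each entry holds the X/Y direction of that pixel's viewing ray at unit depth, so multiplying it by the metric depth gives the same camera-space point the SDK's coordinate mapper would compute.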

Also, take a look at:

https://msdn.microsoft.com/en-us/library/windowspreview.kinect.coordinatemapper.mapdepthframetocameraspace.aspx

or

https://msdn.microsoft.com/en-us/library/windowspreview.kinect.coordinatemapper.mapdepthframetocameraspaceusingibuffer.aspx
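For instance, a rough sketch of the first of these calls (assuming `sensor` is an open KinectSensor and `depthData` has already been filled via DepthFrame.CopyFrameDataToArray):

// one raw depth value (in millimeters) per pixel, 512 x 424
ushort[] depthData = new ushort[512 * 424];
// ... fill depthData with DepthFrame.CopyFrameDataToArray(depthData) ...

// let the SDK do the whole 2D-to-3D conversion in one call
CameraSpacePoint[] cameraPoints = new CameraSpacePoint[depthData.Length];
sensor.CoordinateMapper.MapDepthFrameToCameraSpace(depthData, cameraPoints);
// cameraPoints[i] is now the 3D point (in meters) for depth pixel i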

In addition, the depth, infrared, and body-index streams are all aligned (same resolution), so this one table is really all you need for them. If you need the color points as well, you should save that mapping too. All this information is available on the Kinect MSDN 2.0 website.

I hope you are able to save this information and then redo your spatial resolution test.

answered Sep 28 '22 by 16per9