 

Official Kinect SDK and Unity3d

Tags: unity3d, kinect

Does anyone know anything about using Kinect input in Unity3D with the official SDK? I've been assigned a project to try to integrate the two, but my supervisor doesn't want me to use the open-source Kinect stuff. The last news out of the Unity site was that the Kinect SDK requires .NET 4.0, while Unity3D only supports .NET 3.5.

Workarounds? Please point me toward resources if you know anything about this.

asked Jun 22 '11 by Ele Munjeli

2 Answers

The OpenNI bindings for Unity are probably the best way to go. The NITE skeleton is more stable than the Microsoft Kinect SDK's, but it still requires a calibration pose (PrimeSense has mentioned that a calibration-free skeleton is coming soon).

There is a bridge that binds the Kinect SDK to OpenNI, making the Kinect SDK work like SensorKinect; this module also exposes the Kinect SDK's calibration-free skeleton as an OpenNI module:

https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/

Because the Kinect SDK also provides ankles and wrists, and OpenNI already supported them (even though NITE didn't), all the OpenNI content, including Unity character rigs that include ankles and wrists, just works, and without calibration.

The Kinect SDK bindings for OpenNI also support NITE's skeleton and hand trackers, with one caveat: NITE's gesture detection doesn't seem to work with the Kinect SDK yet. The workaround when using the Kinect SDK with NITE's handGenerator is to use skeleton tracking to provide you with a hand point. Unfortunately, you lose the ability to track just the hands when your body isn't visible to the sensor.

Still, NITE's skeleton seems more stable and more responsive than the Kinect SDK's.

answered Sep 23 '22 by Amir


How much of the raw Kinect data do you need? For a constrained problem, like just getting limb articulation, have you thought about using an agnostic communication scheme such as a TcpClient? Just create a simple TCP server in .NET 4.0 that links to the Kinect SDK and pumps out packets with the info you need every 30 ms or so, then write a receiving client in Unity. I had a similar problem with a different SDK. I haven't tried the Kinect, though, so maybe my suggestion is overkill.
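As a rough illustration of that idea, here is what the .NET 4.0 server side might look like. Everything specific in it is an assumption made for the sketch: port 9000, a flat 20-joint X/Y/Z float layout, and the ReadJointsFromKinect helper are stand-ins; the real values would come from the Kinect SDK's skeleton frames.

    // Standalone .NET 4.0 console app: hosts the Kinect SDK and streams joint
    // positions to one TCP client at roughly 30 Hz.
    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading;

    class KinectTcpServer
    {
        const int JointCount = 20;   // assumed skeleton layout: 20 joints, X/Y/Z each

        static void Main()
        {
            var listener = new TcpListener(IPAddress.Loopback, 9000);
            listener.Start();
            Console.WriteLine("Waiting for the Unity client...");

            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                while (client.Connected)
                {
                    float[] joints = ReadJointsFromKinect();

                    // Pack the floats into one fixed-size binary packet.
                    byte[] packet = new byte[joints.Length * sizeof(float)];
                    Buffer.BlockCopy(joints, 0, packet, 0, packet.Length);
                    stream.Write(packet, 0, packet.Length);

                    Thread.Sleep(33);   // ~30 packets per second
                }
            }
        }

        // Hypothetical helper: in a real server this would copy the latest
        // skeleton frame from the Kinect SDK into a flat X/Y/Z array.
        static float[] ReadJointsFromKinect()
        {
            return new float[JointCount * 3];
        }
    }

And a matching Unity-side receiver, using plain .NET 3.5 sockets; the joint count and packet layout just have to match whatever the server writes:

    // Unity-side receiver (MonoBehaviour). Reads the fixed-size packet the
    // server writes and unpacks it into Vector3s other scripts can read.
    using System.Net.Sockets;
    using UnityEngine;

    public class KinectTcpReceiver : MonoBehaviour
    {
        const int JointCount = 20;                      // must match the server
        const int PacketSize = JointCount * 3 * 4;      // 3 floats per joint, 4 bytes each

        public Vector3[] Joints = new Vector3[JointCount];

        TcpClient client;
        NetworkStream stream;

        void Start()
        {
            client = new TcpClient("127.0.0.1", 9000);
            stream = client.GetStream();
        }

        void Update()
        {
            if (!stream.DataAvailable) return;

            // Block until one complete packet has been read.
            byte[] buffer = new byte[PacketSize];
            int read = 0;
            while (read < PacketSize)
                read += stream.Read(buffer, read, PacketSize - read);

            for (int i = 0; i < JointCount; i++)
            {
                int offset = i * 12;                    // 12 bytes per joint
                Joints[i] = new Vector3(
                    System.BitConverter.ToSingle(buffer, offset),
                    System.BitConverter.ToSingle(buffer, offset + 4),
                    System.BitConverter.ToSingle(buffer, offset + 8));
            }
        }

        void OnDestroy()
        {
            if (stream != null) stream.Close();
            if (client != null) client.Close();
        }
    }

The point of the split is that the Kinect SDK lives entirely in the external .NET 4.0 process, so Unity never has to load the 4.0 assemblies.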

If you want real-time depth/color data, you might need something a bit faster, perhaps using pipes?
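For completeness, a minimal server-side sketch of that variant using a local named pipe instead of TCP; the pipe name, 320x240 resolution, and 16-bit depth format are assumptions, and whether Unity's Mono profile exposes System.IO.Pipes on the client side is something you'd have to verify.

    // Server-side sketch of the same idea over a local named pipe instead of TCP.
    using System.IO.Pipes;

    class DepthPipeServer
    {
        static void Main()
        {
            using (var pipe = new NamedPipeServerStream("kinect_depth", PipeDirection.Out))
            {
                pipe.WaitForConnection();

                byte[] frame = new byte[320 * 240 * 2];   // one 16-bit depth frame
                while (pipe.IsConnected)
                {
                    // Placeholder: copy the latest Kinect depth frame into 'frame' here.
                    pipe.Write(frame, 0, frame.Length);
                    pipe.Flush();
                }
            }
        }
    }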

answered Sep 21 '22 by Jerdak