I created a WPF project in Visual Studio. The XAML markup is managed by C# code-behind. What I want to do is create a component on the user interface that shows a 3D scene. I would like this 3D scene to be managed by Unity, because I need to take advantage of Unity's physics engine. The user must be able to interact with this 3D scene via gestures recognized by Kinect (for example, tossing a ball).
Is there any way to connect WPF, Unity3D, and Kinect so that the user can manipulate the 3D scene in this manner? If so, could you provide some examples or tutorials? If not, what is the best approach for letting the user manipulate a 3D scene with Kinect gestures?
For using Unity3D with WPF, I would look at the question "How get the Unity3D View In wpf winform". The tricky part will be the Kinect interaction.
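To give an idea of what that embedding looks like, here is a minimal sketch in C# code-behind. It assumes XAML containing a WindowsFormsHost named unityHost wrapping a WinForms Panel, a standalone Unity player build (the UnityPlayer.exe path is hypothetical), and a Unity version that supports the -parentHWND command-line switch:

```csharp
// Minimal sketch: embed a standalone Unity player inside a WPF window.
// Assumes XAML containing a WindowsFormsHost named unityHost that wraps
// a System.Windows.Forms.Panel, a player build named UnityPlayer.exe
// (hypothetical path), and a Unity version supporting -parentHWND.
using System.Diagnostics;
using System.Windows;

public partial class MainWindow : Window
{
    private Process unityProcess;

    public MainWindow()
    {
        InitializeComponent();
        Loaded += OnLoaded;
        Closing += (s, e) => { if (unityProcess != null) unityProcess.Kill(); };
    }

    private void OnLoaded(object sender, RoutedEventArgs e)
    {
        var panel = (System.Windows.Forms.Panel)unityHost.Child;
        // -parentHWND re-parents the player window into our panel;
        // "delayed" keeps it hidden until Unity has finished loading.
        unityProcess = Process.Start(new ProcessStartInfo
        {
            FileName = "UnityPlayer.exe",
            Arguments = string.Format("-parentHWND {0} delayed", panel.Handle),
            UseShellExecute = true
        });
    }
}
```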
You can interface with Kinect through WPF and use ZigFu for Unity. The downside of handling the Kinect interaction in WPF is that you won't be able to send data to Unity unless you use the new Kinect Client Server System and relay all of your information through a web server, then retrieve it in Unity. That is a very bad idea: it will probably not be fast enough and will suffer serious lag.
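For completeness, here is a minimal sketch of the WPF side of that relay, using the Kinect for Windows SDK v1 skeleton stream. I have swapped the web server for a local UDP socket purely for illustration; port 11000 and the "x;y;z" message format are my own assumptions, not part of the Kinect Client Server System:

```csharp
// Minimal WPF-side sketch: read the tracked right-hand joint with the
// Kinect for Windows SDK v1 and forward it to a locally running Unity
// player over UDP. Port 11000 and the "x;y;z" format are assumptions.
using System.Globalization;
using System.Net.Sockets;
using Microsoft.Kinect;

public class KinectForwarder
{
    private readonly UdpClient udp = new UdpClient();
    private readonly Skeleton[] skeletons = new Skeleton[6]; // v1 tracks up to six

    public void Start()
    {
        KinectSensor sensor = KinectSensor.KinectSensors[0];
        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();
    }

    private void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            frame.CopySkeletonDataTo(skeletons);
            foreach (Skeleton s in skeletons)
            {
                if (s.TrackingState != SkeletonTrackingState.Tracked) continue;
                SkeletonPoint hand = s.Joints[JointType.HandRight].Position;
                string msg = string.Format(CultureInfo.InvariantCulture,
                    "{0};{1};{2}", hand.X, hand.Y, hand.Z);
                byte[] bytes = System.Text.Encoding.ASCII.GetBytes(msg);
                udp.Send(bytes, bytes.Length, "127.0.0.1", 11000);
            }
        }
    }
}
```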
The second option, using ZigFu in Unity for the Kinect support, is worth considering. The issue is that if you also want to use the Kinect in WPF, you would have to disconnect it from Unity first, since the sensor can only be held by one application at a time. This is the only feasible way I can see to implement it.
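Whichever route delivers the gesture data, the Unity side comes down to driving the physics engine. Below is a minimal sketch of a MonoBehaviour that pairs with the UDP forwarder above; the port, the 30 fps frame interval, and the toss threshold are all assumptions:

```csharp
// Minimal Unity-side sketch: receive the hand position sent by the WPF
// forwarder above and toss a ball via the physics engine when the hand
// moves toward the sensor fast enough. Port/threshold are assumptions.
using System.Globalization;
using System.Net;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class GestureTossReceiver : MonoBehaviour
{
    public Rigidbody ball;              // assign the ball in the Inspector
    public float tossThreshold = 1.5f;  // required hand speed in m/s (a guess)

    private const float FrameInterval = 1f / 30f;  // Kinect delivers ~30 fps
    private UdpClient udp;
    private Thread listener;
    private volatile bool tossRequested;
    private float lastZ = float.NaN;

    void Start()
    {
        udp = new UdpClient(11000);
        listener = new Thread(Listen) { IsBackground = true };
        listener.Start();
    }

    // Runs on a background thread: Unity's API must not be called here,
    // so we only set a flag and let FixedUpdate apply the force.
    void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data = udp.Receive(ref remote);  // blocking read
            string[] parts = System.Text.Encoding.ASCII.GetString(data).Split(';');
            float z = float.Parse(parts[2], CultureInfo.InvariantCulture);
            // Kinect Z shrinks as the hand moves toward the sensor.
            if (!float.IsNaN(lastZ) && (lastZ - z) / FrameInterval > tossThreshold)
                tossRequested = true;
            lastZ = z;
        }
    }

    void FixedUpdate()
    {
        if (tossRequested)
        {
            tossRequested = false;
            ball.AddForce(Vector3.forward * 5f, ForceMode.Impulse);
        }
    }

    void OnDestroy()
    {
        udp.Close();
    }
}
```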
Overall, this is achievable but will be very difficult. I suggest you use ZigFu, though there are drawbacks to both methods.