
How to detect open/closed hand using Microsoft Kinect for Windows SDK ver 1.7 C#

Tags:

sdk

kinect

I have recently started using the Microsoft Kinect for Windows SDK to program some applications using the Kinect device.

I have been struggling to find a way to detect whether a certain hand is closed or open.

I saw the Kinect for Windows Toolkit, but its documentation is practically non-existent and I can't find a way to make it work.

Does anyone know of a simple way to detect the hand's state? Even better if it doesn't involve the Kinect Toolkit.

Asked Nov 18 '25 by Jacob Cohen

2 Answers

This is how I did it eventually:

First things first, we need a dummy class that looks somewhat like this:

public class DummyInteractionClient : IInteractionClient
{
    public InteractionInfo GetInteractionInfoAtLocation(
        int skeletonTrackingId,
        InteractionHandType handType,
        double x,
        double y)
    {
        var result = new InteractionInfo();
        result.IsGripTarget = true;
        result.IsPressTarget = true;
        result.PressAttractionPointX = 0.5;
        result.PressAttractionPointY = 0.5;
        result.PressTargetControlId = 1;

        return result;
    }
}

Then, in the main application code, we need to create the interaction stream and subscribe to its frame-ready event like this:

this.interactionStream = new InteractionStream(args.NewSensor, new DummyInteractionClient());
this.interactionStream.InteractionFrameReady += InteractionStreamOnInteractionFrameReady;
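Note that the interaction stream only raises `InteractionFrameReady` if it is continuously fed depth and skeleton data from the sensor. A minimal sketch of the two sensor handlers that do this feeding (the handler names and the `depthPixels`/`skeletons` buffers are my own; `ProcessDepth`, `ProcessSkeleton`, and `AccelerometerGetCurrentReading` are the SDK 1.7 API):

```csharp
// Assumes: this.depthPixels is a DepthImagePixel[] and this.skeletons a Skeleton[],
// sized from the sensor's enabled stream formats, and both handlers are
// subscribed to the sensor's DepthFrameReady / SkeletonFrameReady events.
private void SensorOnDepthFrameReady(object sender, DepthImageFrameReadyEventArgs e)
{
    using (DepthImageFrame frame = e.OpenDepthImageFrame())
    {
        if (frame == null)
            return;

        frame.CopyDepthImagePixelDataTo(this.depthPixels);
        this.interactionStream.ProcessDepth(this.depthPixels, frame.Timestamp);
    }
}

private void SensorOnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    using (SkeletonFrame frame = e.OpenSkeletonFrame())
    {
        if (frame == null)
            return;

        frame.CopySkeletonDataTo(this.skeletons);
        // The accelerometer reading lets the stream compensate for sensor tilt.
        this.interactionStream.ProcessSkeleton(
            this.skeletons,
            this.sensor.AccelerometerGetCurrentReading(),
            frame.Timestamp);
    }
}
```

Without this feeding step the interaction frame handler below will simply never fire.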

Finally, the handler itself:

private void InteractionStreamOnInteractionFrameReady(object sender, InteractionFrameReadyEventArgs e)
{
    using (InteractionFrame frame = e.OpenInteractionFrame())
    {
        if (frame == null)
        {
            return;
        }

        if (this.userInfos == null)
        {
            this.userInfos = new UserInfo[InteractionFrame.UserInfoArrayLength];
        }

        frame.CopyInteractionDataTo(this.userInfos);
    }

    foreach (UserInfo userInfo in this.userInfos)
    {
        foreach (InteractionHandPointer handPointer in userInfo.HandPointers)
        {
            string action = null;

            // Grip = hand closed, GripRelease = hand opened.
            switch (handPointer.HandEventType)
            {
                case InteractionHandEventType.Grip:
                    action = "gripped";
                    break;

                case InteractionHandEventType.GripRelease:
                    action = "released";
                    break;
            }

            if (action != null)
            {
                string handSide = "unknown";

                switch (handPointer.HandType)
                {
                    case InteractionHandType.Left:
                        handSide = "left";
                        break;

                    case InteractionHandType.Right:
                        handSide = "right";
                        break;
                }

                if (handSide == "left")
                {
                    if (action == "released")
                    {
                        // left hand released code here
                    }
                    else
                    {
                        // left hand gripped code here
                    }
                }
                else
                {
                    if (action == "released")
                    {
                        // right hand released code here
                    }
                    else
                    {
                        // right hand gripped code here
                    }
                }
            }
        }
    }
}
Answered Nov 21 '25 by Jacob Cohen


SDK 1.7 introduces the interaction concept called "grip". You can read about all the KinectInteraction concepts at the following link: http://msdn.microsoft.com/en-us/library/dn188673.aspx

The way Microsoft has implemented this is via events on a KinectRegion. Among the KinectRegion events are HandPointerGrip and HandPointerGripRelease, which fire at the appropriate moments. Because the event comes from the element the hand is over, you can easily take the appropriate action in the event handler.
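As a sketch of wiring these up (the control name `myButton` is a placeholder of my own; the attached-event helper methods come from `Microsoft.Kinect.Toolkit.Controls`, so verify the exact signatures against the toolkit version you have installed):

```csharp
// Inside a window whose XAML wraps its content in a <k:KinectRegion>.
// myButton is a hypothetical UIElement placed within that region.
public MainWindow()
{
    InitializeComponent();

    KinectRegion.AddHandPointerGripHandler(this.myButton, this.OnHandPointerGrip);
    KinectRegion.AddHandPointerGripReleaseHandler(this.myButton, this.OnHandPointerGripRelease);
}

private void OnHandPointerGrip(object sender, HandPointerEventArgs e)
{
    // The hand hovering over myButton just closed.
    e.Handled = true;
}

private void OnHandPointerGripRelease(object sender, HandPointerEventArgs e)
{
    // The hand hovering over myButton just opened.
    e.Handled = true;
}
```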

Note that a KinectRegion can contain anything. Its base class is ContentControl, so you can place anything from a simple image to a complex Grid layout within the region to be acted on.

You can find an example of how to use this interaction in the ControlBasics-WPF example, provided with the SDK.

UPDATE:

KinectRegion is simply a fancy ContentControl, which in turn is just a container that can have anything put inside it. Have a look at the ControlBasics-WPF example, at the Kinect for Windows CodePlex, and do a search for KinectRegion in the MainWindow.xaml file. You'll see that there are several controls inside it which are acted upon.

To see how Grip and GripRelease are implemented in this example, it is best to open the solution in Visual Studio and do a search for "grip". The way they do it is a little odd, in my opinion, but it is a clean implementation that flows very well.

Answered Nov 21 '25 by Nicholas Pappas


