
    Hand tracking and recognition

    irving09

      In Hands_Tracking, once I have the three-dimensional coordinates of the 22 joints of the hand, can I recognize new gestures from the geometric relationships between these points? If so, which hand measurements can be used, for example the distance from each fingertip to the palm, or the degree to which each finger is open or closed? I only need simple recognition of the number gestures 1-5.

      The platform is VS2013, C++, and a RealSense SR300.
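
      For the simple 1-5 case, here is a minimal C++ sketch of the geometric idea described in the question: the fingertip-to-palm distance decides whether each finger is extended, and the count of extended fingers gives the number. The Point3 struct, the array layout and the 1.5f threshold are illustrative assumptions, not part of the RealSense SDK; the joint positions would be copied out of the SDK's hand data.

      ```cpp
      // Sketch: classify number gestures 1-5 by counting extended fingers.
      // Joint positions are plain 3D points copied from the hand tracker
      // (the names and threshold here are illustrative, not SDK API).
      #include <array>
      #include <cmath>

      struct Point3 { float x, y, z; };

      static float Distance(const Point3& a, const Point3& b)
      {
          float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
          return std::sqrt(dx * dx + dy * dy + dz * dz);
      }

      // A finger counts as "extended" when its tip is clearly farther from
      // the palm centre than its base joint is. The 1.5f ratio is a tunable
      // threshold, chosen here for illustration.
      static bool IsExtended(const Point3& tip, const Point3& base, const Point3& palm)
      {
          return Distance(tip, palm) > 1.5f * Distance(base, palm);
      }

      // tips[i] / bases[i] hold the tip and base joint of finger i
      // (thumb, index, middle, ring, pinky). Returns 0-5; a result of
      // 1..5 maps directly onto the number gestures.
      static int CountExtendedFingers(const std::array<Point3, 5>& tips,
                                      const std::array<Point3, 5>& bases,
                                      const Point3& palm)
      {
          int count = 0;
          for (int i = 0; i < 5; ++i)
              if (IsExtended(tips[i], bases[i], palm))
                  ++count;
          return count;
      }
      ```

      In practice you would smooth the result over a few frames and tune the threshold per finger (the thumb in particular behaves differently from the other four).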

        • 1. Re: Hand tracking and recognition
          MartyG

          I did something similar to what you are trying to do in my own RealSense project in the Unity game engine. I created virtual objects to track the hand points and then analyzed the positions, angles and distances between the virtual objects that represented the tracked joints.
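
          As a rough illustration of that angle analysis (not the actual Unity code), the bend of a finger can be measured as the angle at its middle joint, formed by the two bone segments meeting there. The Point3 struct and joint names are assumptions; the positions would come from the tracker.

          ```cpp
          // Sketch: bend angle at joint b, formed by segments b->a and b->c.
          #include <cmath>

          struct Point3 { float x, y, z; };

          static float JointAngle(const Point3& a, const Point3& b, const Point3& c)
          {
              float ux = a.x - b.x, uy = a.y - b.y, uz = a.z - b.z;
              float vx = c.x - b.x, vy = c.y - b.y, vz = c.z - b.z;
              float dot = ux * vx + uy * vy + uz * vz;
              float lu  = std::sqrt(ux * ux + uy * uy + uz * uz);
              float lv  = std::sqrt(vx * vx + vy * vy + vz * vz);
              if (lu == 0.0f || lv == 0.0f) return 0.0f; // degenerate input
              float cosA = dot / (lu * lv);
              if (cosA >  1.0f) cosA =  1.0f; // clamp for acos
              if (cosA < -1.0f) cosA = -1.0f;
              return std::acos(cosA); // near pi = straight finger, small = curled
          }
          ```

          An angle threshold on each finger gives you an open/closed flag, which is another route to the 1-5 count besides the distance test.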


          As well as tracking the supported parts of the hand (fingertips, middle and base finger joints, palms, etc.), you can also work out what other, untracked parts of the body should be doing when the tracked parts move. This is known as Inverse Kinematics. Using this principle in my project, I was able to create an entire usable arm by working out what everything up from the hand (wrist, elbow joint, shoulder joint, etc.) should be doing when the hand moves in a certain way.
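
          A minimal sketch of the two-bone case of that idea: if the upper-arm and forearm lengths are known, the elbow bend follows from the shoulder-to-wrist distance by the law of cosines. The function name and the convention (pi = straight arm) are assumptions for illustration, not from MartyG's project.

          ```cpp
          // Sketch: interior elbow angle for a two-bone chain (upper arm +
          // forearm) whose endpoints are 'reach' apart, via the law of
          // cosines: reach^2 = u^2 + f^2 - 2*u*f*cos(elbow).
          #include <cmath>

          static float ElbowAngle(float upperArm, float forearm, float reach)
          {
              float cosE = (upperArm * upperArm + forearm * forearm - reach * reach)
                         / (2.0f * upperArm * forearm);
              if (cosE >  1.0f) cosE =  1.0f;  // target closer than fully bent
              if (cosE < -1.0f) cosE = -1.0f;  // target beyond full extension
              return std::acos(cosE);          // pi = arm straight, smaller = more bent
          }
          ```

          With the elbow angle known, the elbow position can be placed on the circle of solutions around the shoulder-wrist line, which is how the rest of the arm is reconstructed from the tracked hand alone.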


          You can read more about the 'CamAnims' system I built here:


          Using The 'CamAnims' Real-Time Game Animation Technique