
Hand tracking and recognition

yzhan72
Beginner

In Hands_Tracking, when I get the three-dimensional coordinates of the 22 joints of my hand, can I recognize new gestures from the geometric relationships between these points? If so, which hand information can be used, for example the distance from the fingertips to the palm, the degree of opening and closing of each finger, and so on? I only need to consider simple gesture recognition for the numbers 1-5.

The platform is VS2013, C++, and a RealSense SR300.

MartyG
Honored Contributor III

I did something similar to what you are trying to do in my own RealSense project in the Unity game engine. I created virtual objects to track the hand points and then analyzed the positions, angles, and distances between the virtual objects that represented the tracked joints.
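
To make that concrete for your VS2013 / C++ setup, here is a minimal sketch of the fingertip-to-palm distance idea. It assumes you have already copied the joint positions from Hands_Tracking into plain 3D points; the Point3 and Finger structs, the dummy data, and the 1.6 ratio threshold are my own illustrative choices, not part of the RealSense SDK.

// Minimal sketch (not the RealSense SDK API): given joint positions you
// already retrieve from Hands_Tracking, count how many fingers are extended
// by comparing each fingertip's distance from the palm centre to its
// knuckle's distance from the palm centre. The threshold is an assumption.
#include <array>
#include <cmath>
#include <cstdio>

struct Point3 { float x, y, z; };

static float Distance(const Point3& a, const Point3& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Finger
{
    Point3 tip;   // fingertip joint
    Point3 base;  // finger base (knuckle) joint
};

// Returns 0-5: the number of fingers judged to be extended.
int CountExtendedFingers(const Point3& palmCenter,
                         const std::array<Finger, 5>& fingers,
                         float extendedRatio = 1.6f) // assumed threshold, tune per hand
{
    int count = 0;
    for (const Finger& f : fingers)
    {
        const float tipDist  = Distance(f.tip,  palmCenter);
        const float baseDist = Distance(f.base, palmCenter);
        // An extended finger places its tip well beyond its knuckle;
        // a curled finger pulls the tip back toward the palm.
        if (baseDist > 0.0f && tipDist / baseDist > extendedRatio)
            ++count;
    }
    return count;
}

int main()
{
    // Dummy data standing in for one frame of tracked joints (metres).
    Point3 palm = { 0.0f, 0.0f, 0.40f };
    std::array<Finger, 5> fingers = { {
        { { 0.06f, 0.02f, 0.40f }, { 0.03f, 0.01f, 0.40f } },  // thumb, extended
        { { 0.01f, 0.09f, 0.40f }, { 0.01f, 0.04f, 0.40f } },  // index, extended
        { { 0.00f, 0.05f, 0.40f }, { 0.00f, 0.04f, 0.40f } },  // middle, curled
        { {-0.01f, 0.05f, 0.40f }, {-0.01f, 0.04f, 0.40f } },  // ring, curled
        { {-0.02f, 0.05f, 0.40f }, {-0.02f, 0.04f, 0.40f } },  // little, curled
    } };

    std::printf("Gesture: %d\n", CountExtendedFingers(palm, fingers));
    return 0;
}

Because absolute distances change with hand size and distance from the camera, a ratio like tip-distance over knuckle-distance (or normalising by palm width) tends to be more robust than a fixed threshold in metres.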

As well as tracking the supported parts of the body (fingertips, middle and base finger joints, palms, etc.), you can also work out what other, untracked parts of the body would be doing when the tracked parts move. This is known as Inverse Kinematics. Using this principle in my project, I was able to create an entire usable arm by working out what everything up from the hand (wrist, elbow joint, shoulder joints, etc.) should be doing when the hand is moved in a certain way.
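
If you want to experiment with that inverse-kinematics idea outside Unity, here is a rough two-bone sketch in plain C++ (not my CamAnims code). It is simplified to 2D, and the shoulder position, hand position, and bone lengths are invented values for illustration; it only shows the law-of-cosines step that recovers elbow and shoulder angles from a tracked hand position.

// Rough two-bone (analytic) inverse-kinematics sketch in 2D: given a fixed
// shoulder, known upper-arm and forearm lengths, and a tracked hand position,
// solve for the shoulder and elbow angles with the law of cosines.
// All numeric values here are invented for the demo.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Solves shoulder and elbow angles (radians) so the arm reaches 'hand'.
void SolveTwoBoneIK(const Vec2& shoulder, const Vec2& hand,
                    float upperArmLen, float forearmLen,
                    float& shoulderAngle, float& elbowAngle)
{
    const float dx = hand.x - shoulder.x;
    const float dy = hand.y - shoulder.y;
    // Clamp the target distance to what the arm can physically reach.
    float dist = std::sqrt(dx * dx + dy * dy);
    dist = std::min(dist, upperArmLen + forearmLen - 1e-4f);

    // Law of cosines: interior angle at the elbow of the
    // shoulder-elbow-hand triangle.
    float cosElbow = (upperArmLen * upperArmLen + forearmLen * forearmLen
                      - dist * dist) / (2.0f * upperArmLen * forearmLen);
    cosElbow = std::max(-1.0f, std::min(1.0f, cosElbow));
    elbowAngle = std::acos(cosElbow);

    // Shoulder angle = direction to the hand, offset by the triangle's
    // interior angle at the shoulder.
    float cosShoulder = (upperArmLen * upperArmLen + dist * dist
                         - forearmLen * forearmLen) / (2.0f * upperArmLen * dist);
    cosShoulder = std::max(-1.0f, std::min(1.0f, cosShoulder));
    shoulderAngle = std::atan2(dy, dx) - std::acos(cosShoulder);
}

int main()
{
    Vec2 shoulder = { 0.0f, 0.0f };
    Vec2 hand     = { 0.35f, 0.20f };   // tracked hand position (metres)
    float shoulderAngle = 0.0f, elbowAngle = 0.0f;

    SolveTwoBoneIK(shoulder, hand, 0.30f, 0.25f, shoulderAngle, elbowAngle);
    std::printf("shoulder = %.1f deg, elbow = %.1f deg\n",
                shoulderAngle * 57.2958f, elbowAngle * 57.2958f);
    return 0;
}

In a full 3D arm you would do the same solve in the plane defined by the shoulder, the hand, and a chosen elbow "pole" direction, but the triangle maths is the same.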

You can read more about the 'CamAnims' system I built here:
