I did something similar to what you are trying to do in my own RealSense project in the Unity game engine. I created a virtual object for each tracked hand point and then analyzed the positions, angles and distances between the virtual objects that represented the tracked joints.
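To illustrate the kind of analysis I mean, here is a minimal, language-agnostic sketch (in Python rather than Unity's C#, and with hypothetical joint positions) of measuring the distance between two joints and the angle formed at a joint:

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D joint positions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint`, formed by the segments to `parent` and `child`."""
    u = [p - j for p, j in zip(parent, joint)]
    v = [c - j for c, j in zip(child, joint)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    cos_theta = dot / (distance(parent, joint) * distance(child, joint))
    # Clamp to [-1, 1] to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

# Hypothetical tracked positions (meters): palm, index base joint, index fingertip
palm = (0.0, 0.0, 0.0)
index_base = (0.0, 0.08, 0.0)
index_tip = (0.0, 0.08, 0.05)

print(distance(palm, index_tip))
print(joint_angle(palm, index_base, index_tip))  # 90.0 for this right-angle pose
```

In Unity you would feed each frame's tracked joint positions into comparisons like these (e.g. fingertip-to-palm distance to detect a closed fist).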
As well as tracking the supported parts of the body (fingertips, middle and base finger joints, palms, etc.), you can also work out what other, untracked parts of the body must be doing when the tracked parts move. This technique is known as Inverse Kinematics (IK): inferring the pose of a whole joint chain from the position of its end point. Using this principle in my project, I was able to create an entire usable arm by working out what everything up the chain from the hand (wrist, elbow, shoulder, etc.) should be doing when the hand is moved in a certain way.
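A standard way to do this for an arm is an analytic two-bone IK solve using the law of cosines. This is a generic sketch of the idea in 2D Python, not my actual Unity implementation; the lengths and positions are hypothetical, and it assumes the target is within the arm's reach:

```python
import math

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Solve a 2D two-bone chain (upper arm + forearm) so the hand reaches `target`.

    Returns (shoulder_angle, elbow_angle) in radians: the absolute direction of
    the upper arm, and the interior angle at the elbow.
    """
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    # Clamp the reach so a too-far target yields a fully extended arm
    dist = min(math.hypot(dx, dy), upper_len + fore_len - 1e-9)
    # Interior elbow angle from the law of cosines
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, offset toward the elbow
    cos_off = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    shoulder_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_off)))
    return shoulder_angle, elbow

def forward(shoulder, angles, upper_len, fore_len):
    """Forward kinematics check: recover the elbow and hand positions from the angles."""
    sa, elbow = angles
    ex = shoulder[0] + upper_len * math.cos(sa)
    ey = shoulder[1] + upper_len * math.sin(sa)
    fd = sa + (elbow - math.pi)  # absolute direction of the forearm
    hx = ex + fore_len * math.cos(fd)
    hy = ey + fore_len * math.sin(fd)
    return (ex, ey), (hx, hy)

# Shoulder at the origin, both bones 1.0 long, hand target at (1, 1)
angles = two_bone_ik((0.0, 0.0), (1.0, 1.0), 1.0, 1.0)
elbow_pos, hand_pos = forward((0.0, 0.0), angles, 1.0, 1.0)
print(elbow_pos, hand_pos)  # the hand lands on the (1, 1) target
```

This only moves the hand in a plane and always bends the elbow to one side; a full 3D arm adds an elbow "pole" direction to choose which way the joint bends, but the law-of-cosines core is the same.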
You can read more about the 'CamAnims' system I built here: