Your assessment of the Person Tracking feature's limitations is accurate. In my own RealSense project, I extrapolate a full skeleton by analyzing a handful of tracked body points, such as the palm and spine, and calculating what the untracked body points are likely to be doing when those tracked points move in a particular way.
That does not help with your dilemma of how to simulate hand open/close, though. In the Unity game engine's implementation of RealSense hand tracking, hand status is derived from palm tracking: if the hand is open and the palm is visible to the camera, this counts as Hand Open; if the palm is not visible to the camera because the hand is closed, this is regarded as Hand Closed.
Since the Hand Closed status is probably determined in part by analyzing hand joint positions (something the RealSense SDK for Linux cannot do), the easiest way to simulate Hand Closed may be to equate it with Hand Lost: if the camera cannot detect the palm, then the hand is automatically classed as closed.
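As a minimal sketch of that fallback, the helper below maps a palm reading to an open/closed status. Note that `palm_position` here is a hypothetical stand-in for whatever value your tracking call actually returns, not a real RealSense SDK function or field; the assumption is simply that you get some coordinate while the palm is tracked and `None` (or an equivalent "lost" signal) when it is not.

```python
from enum import Enum

class HandStatus(Enum):
    OPEN = "open"
    CLOSED = "closed"

def infer_hand_status(palm_position):
    """Treat a lost palm as a closed hand.

    `palm_position` is a hypothetical stand-in for your tracker's
    output: some coordinate value while the palm is detected, or
    None once the palm can no longer be found.
    """
    # Palm visible -> assume open; palm lost -> classify as closed.
    return HandStatus.OPEN if palm_position is not None else HandStatus.CLOSED
```

So a frame where the palm is detected at, say, `(120, 85)` would report `HandStatus.OPEN`, while a frame with no palm reading would report `HandStatus.CLOSED`. In practice you would probably also want to ignore very brief dropouts so that a single missed frame does not register as a fist.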
I could not find a way to tell if the hand was closed. The Person Lib still returns the hand position when the hand is closed.
Is there any way to get more detail about the detection from which it would be possible to derive the hand status (open/closed)?
Unfortunately, I was not able to find a built-in way to detect a hand-closed state in the RealSense SDK for Linux. It does support a pointing gesture, though, and comes with a sample program demonstrating it. Perhaps you could use open hand / point instead of open hand / closed hand?
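If you go that route, one practical concern is that per-frame gesture detections tend to flicker. The sketch below shows one way to smooth a per-frame `pointing_detected` flag into a stable open/point state; the flag itself is a hypothetical input you would feed from whatever the SDK's pointing-gesture sample reports each frame, and the class is my own illustration, not part of the SDK.

```python
class GestureToggle:
    """Debounce a hypothetical per-frame pointing flag into a stable
    open/point state, flipping only after `hold_frames` consecutive
    frames agree, to ride out single-frame detection flicker."""

    def __init__(self, hold_frames=5):
        self.hold_frames = hold_frames
        self.state = "open"      # current stable state: "open" or "point"
        self._streak = 0         # consecutive frames disagreeing with state

    def update(self, pointing_detected):
        candidate = "point" if pointing_detected else "open"
        if candidate == self.state:
            # Frame agrees with the current state; reset the streak.
            self._streak = 0
        else:
            # Frame disagrees; only flip once enough frames agree.
            self._streak += 1
            if self._streak >= self.hold_frames:
                self.state = candidate
                self._streak = 0
        return self.state
```

With `hold_frames=3`, two isolated pointing detections leave the state at "open", and only a third consecutive one flips it to "point", so a momentary misdetection does not toggle your simulated grab.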