In the form of Person Tracking used in the RealSense SDK for Linux, the shoulders are identified with the instruction HEAD_SHOULDERS.
The form of Person Tracking used in the Windows RealSense SDK does not seem to track shoulders - just hands, head and chest.
In my own project in the Unity game creation engine, I detect various bones that are not normally trackable by looking at the points that can be tracked and working out how the untracked bones should be behaving when tracked points such as the hand and face are moved. I call this method Reverse Thinking, and it has been compared to the principle of Inverse Kinematics.
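The principle of inferring an untracked joint from tracked ones can be sketched with a simple two-bone inverse-kinematics solve. This is only an illustration of the idea, not my actual Unity code: the `estimate_elbow` helper, the 2D setup, and the bone lengths are all hypothetical.

```python
import math

# Hypothetical bone lengths in metres; a real setup would calibrate these.
UPPER_ARM = 0.30
FOREARM = 0.28

def estimate_elbow(shoulder, hand):
    """Given a tracked shoulder and hand point in 2D, work out where the
    untracked elbow must sit, using the law of cosines (two-bone IK)."""
    sx, sy = shoulder
    hx, hy = hand
    dx, dy = hx - sx, hy - sy
    dist = math.hypot(dx, dy)
    # Clamp so the hand target stays within reach of the two bones.
    dist = min(dist, UPPER_ARM + FOREARM - 1e-6)
    # Angle at the shoulder between the shoulder->hand line and the upper arm.
    cos_a = (UPPER_ARM**2 + dist**2 - FOREARM**2) / (2 * UPPER_ARM * dist)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    # Two mirror solutions exist; pick the one with the elbow bending down.
    ex = sx + UPPER_ARM * math.cos(base - angle)
    ey = sy + UPPER_ARM * math.sin(base - angle)
    return ex, ey

elbow = estimate_elbow((0.0, 0.0), (0.4, -0.2))
print(elbow)
```

The estimated elbow always sits exactly one upper-arm length from the shoulder and one forearm length from the hand, which is the constraint an IK-style approach enforces when the middle joint is not tracked.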
Thanks for the information that the Windows SDK's documentation lists shoulder definitions in its joint system.
Despite extensive research, I have not found anyone who has managed to access the shoulder joints in the Windows SDK, even though shoulder commands are listed in the documentation. People can only access the hands, chest and head.
It is hard to even find a script that shows how to call a specific skeleton joint. Most scripts stop just before the joint call and leave that section blank for others to fill in.
I did manage to find a script that shows the instruction for calling a joint though.
In their script, they use:
So if you wanted to call the left shoulder joint, for example, you might substitute in the shoulder joint instruction from the RealSense documentation.
As I said, I have not seen any scripts that use the shoulder joint. But there is no harm in trying this and seeing what happens.
Once I had found this instruction, I was also able to find an alternative librealsense script by a RealSense expert called Andre Carlucci that references the shoulder joints.