1 Reply Latest reply on Mar 20, 2017 11:56 AM by MartyG

    'My Father's Face' for PC and Project Alloy - Tech Trailer 8

    MartyG

      Hi everyone,

       

      The eighth in the series of 'Tech Trailer' technology previews for Sambiglyon's forthcoming PC and Project Alloy game 'My Father's Face' has now been released.  In this video, we highlight control of various kinds of technology in the in-world environment using hand waves and presses.

       

       

      Link: 'My Father's Face' Tech Trailer 8 for PC and Project Alloy - YouTube

       

      Previous Tech Trailers in the series are also available on our YouTube channel.

        • 1. Re: 'My Father's Face' for PC and Project Alloy - Tech Trailer 8
          MartyG

          I thought it would be interesting to add some background information about the technology in the game trailer above.

           

          *  The arm movements are not pre-canned animations triggered by gestures.  They are 100% real-time, using a custom animation system that I designed.  The arms have multi-directional shoulder jointing so that they can swing up and down, inward and outward, and forward and back, replicating the full range of motion and capability of a real human shoulder and arm.

           

          *  The joints follow a philosophy I designed that I call Reverse Thinking.  Since the RealSense camera cannot track arm and shoulder bone joints, the virtual arm movements are instead calculated by a system that looks at the player's hand movements and decides how the rest of the arm attached to that hand should be affected by the hand's motion.  This is the reverse of the real human arm, where the shoulder primarily determines how the hand at the end of it moves.
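As a rough illustration of the Reverse Thinking idea, the sketch below derives shoulder angles from a tracked hand position rather than the other way around.  The function name, coordinate conventions, and angle choices are my own assumptions for illustration; the actual CamAnim formulas are not published.

```python
import math

def reverse_think_arm(hand_pos, shoulder_pos):
    """Estimate shoulder angles from a tracked hand position ("Reverse Thinking").

    Hypothetical sketch, not the game's actual math.
    hand_pos / shoulder_pos are (x, y, z) tuples in world units.
    Returns (elevation, swing) in degrees.
    """
    dx = hand_pos[0] - shoulder_pos[0]
    dy = hand_pos[1] - shoulder_pos[1]
    dz = hand_pos[2] - shoulder_pos[2]
    # Elevation: how far the hand has been raised relative to the shoulder.
    elevation = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    # Swing: forward/back angle of the whole arm around the vertical axis.
    swing = math.degrees(math.atan2(dz, dx))
    return elevation, swing
```

A hand held straight out to the side gives an elevation near 0, while a hand directly above the shoulder gives an elevation near 90, which is the kind of signal the rest of the arm's animation can then be driven from.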

           

          *  The basic hand motions are tracked by the 'TrackingAction' hand tracking script supplied with the RealSense SDK's 'Unity Toolkit'.  In 'My Father's Face' though, their only purpose is to move a simple controller object - a cube that position-moves for arm controls, and a sphere that rotation-moves for face and torso controls.  The coordinate or angle values generated are fed into the custom animation system - which we call CamAnim - and converted by math formulas into a value between 0 and 1.
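The conversion of a raw controller coordinate into a 0-to-1 value can be sketched as a simple clamped normalization.  The function name and the calibration parameters `lo`/`hi` are assumptions for illustration; the post does not state the actual formulas.

```python
def camanim_value(raw, lo, hi):
    """Map a raw controller-object coordinate into the 0..1 CamAnim range.

    Hypothetical sketch of the normalization step: lo and hi are the
    calibrated extremes of the controller object's travel.
    """
    t = (raw - lo) / (hi - lo)
    # Clamp so out-of-range tracking values cannot push the clip past its ends.
    return max(0.0, min(1.0, t))
```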

           

          *  The 'CamAnim Value' is then linked to a Unity animation clip of an action, such as an avatar arm that goes from fully lowered (position '0' on the animation clip's timeline) to fully raised above the head ('1' at the end of the clip's timeline).  The CamAnim value generated from the TrackingAction inputs allows the animation clip's timeline to be rewound or fast-forwarded in real time, producing live motion-capture-style animation in the avatar.  About a dozen of these controllers working simultaneously animate the entire body in almost 1:1 mirroring of the player's real body - arms, waist, legs and facial expressions.
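In Unity terms this is scrubbing a clip's normalized time; the sketch below simulates the same idea outside the engine by sampling a keyframed clip at a normalized time between 0 and 1.  The function and keyframe format are illustrative assumptions, standing in for driving something like a Unity animation state's normalized time from the CamAnim value.

```python
def sample_clip(keyframes, t):
    """Scrub an animation clip to normalized time t (0..1) and return the
    interpolated pose value, mimicking how a CamAnim value rewinds or
    fast-forwards a clip's timeline in real time.

    keyframes: sorted list of (time, value) pairs with times in 0..1.
    Hypothetical stand-in for an engine's clip-sampling call.
    """
    t = max(0.0, min(1.0, t))  # CamAnim values are already clamped to 0..1
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
            return v0 + f * (v1 - v0)  # linear blend between the two keyframes
    return keyframes[-1][1]
```

For an arm clip keyed from 0 degrees at time 0 to 90 degrees at time 1, a CamAnim value of 0.5 yields a half-raised arm.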

           

          *  The CamAnim formulas and the motions encoded in the animation clips are the result of two years' worth of regular anatomical study of how the real body moves.  Human limbs are deceptively complex, and motion that seems simple when observed is actually the result of optical illusions created by several arm parts all moving at the same time.  To truly replicate human arm movement, it was necessary to go down a number of experimental dead-ends until the true source of a particular motion was discovered, so that it could then be correctly animated.

           

          For example, when the arm is lifted up, you might think that the arm is simply rotating in its joint on the spot like a toy action figure, and our prototypes did use that movement model for a while.  Eventually, we learned through our anatomy studies that an arm lift is made up of a diagonal-forward-up tilt of the shoulder, with the arm simultaneously moving sideways in the outward direction as it lifts away from the armpit.

           

          *  By analyzing facial CamAnim Values, you can determine the approximate emotional state of the player.  If their eyebrow CamAnim has a value near '0' then they are likely to be happy, because the brows are barely moved from their fully-raised starting position.  If the CamAnim value is near or at '1' though then the player is likely to be sad, because their real-life frown is mirrored in the timeline position of the eyebrow animation clip. 

           

          Facial CamAnim values can also be combined for more complex emotional analysis.  If only the mouth = '1' then the player may be 'Sad', as their lips are fully downturned.  If the mouth CamAnim = '1' and the eyebrow CamAnim also = '1', though, then they are likely to be in an 'Angry' emotional state, because their mouth is down and their eyebrows fully furrowed.
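The combined rules above can be sketched as a small classifier over the eyebrow and mouth CamAnim values.  The thresholds are my own illustrative assumptions, since the post only describes the behaviour at the extremes ('0' and '1').

```python
def classify_emotion(eyebrow, mouth, hi=0.8, lo=0.2):
    """Rough emotion guess from facial CamAnim values (0..1 each).

    Illustrative thresholds: values within lo..hi fall back to 'Neutral',
    which the post does not define but a practical system would need.
    """
    if eyebrow >= hi and mouth >= hi:
        return "Angry"    # mouth fully downturned AND brows fully furrowed
    if mouth >= hi:
        return "Sad"      # lips fully downturned on their own
    if eyebrow >= hi:
        return "Sad"      # real-life frown mirrored in the eyebrow clip position
    if eyebrow <= lo:
        return "Happy"    # brows barely moved from their fully-raised start
    return "Neutral"
```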

           

          *  The same methodology can be used to work out the physical pose that the player's body is in, by analyzing the CamAnims of the waist, neck, arm and leg joints.
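Extending the same methodology to body pose might look like the sketch below.  The joint names and pose rules are entirely hypothetical examples, not the game's actual pose table.

```python
def classify_pose(cam):
    """Guess a gross body pose from a dict of joint CamAnim values (0..1).

    Joint names and thresholds are illustrative assumptions only.
    """
    if cam.get("left_arm", 0.0) > 0.9 and cam.get("right_arm", 0.0) > 0.9:
        return "Arms raised"   # both arm clips near the fully-raised end
    if cam.get("waist", 0.0) > 0.7:
        return "Bent forward"  # waist clip well past its midpoint
    return "Standing"          # default when no rule matches
```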

           

          The CamAnim system enables almost any pose possible for the human body to be replicated with an avatar in real-time, giving the person controlling it nearly as many possibilities for interaction as their real body provides them - from operating a virtual computer to performing the attack of their favorite anime character.  Kamehameha!

           

          We will keep posting more tech details about the development of 'My Father's Face' as it proceeds so that others can replicate our techniques in their own RealSense projects.  So keep an eye on the RealSense forum!
