
    Using The 'CamAnims' Real-Time Game Animation Technique

    MartyG

       

      Difficulty level: Intermediate. The article assumes a familiarity with creating animation clips on a timeline. 

       

      INTRODUCTION

       

      In Fall / Autumn 2017, an exciting new open-world game called 'My Father's Face' is due to be released for PC and Intel Project Alloy “merged reality” headset, with configuration options also included to support comfortable play on other VR headsets such as Oculus and HTC Vive.

       

      Set in a high-gravity alternate world where animals evolved equally with humans and squirrel-people are the dominant species, 'My Father's Face' utilizes real-time 'CamAnims' animation technology three years in the making. CamAnims enables live control of a full-body player character that can move, look around and use its limbs as naturally as a real person. 

       

      This means that little tutorial guidance is required: because the virtual body mirrors the player's own body movements almost 1:1, their life experience informs them how to interact with the game world. Even young children can play alongside an older family member and feel empowered by the degree of control they have over the virtual representation of themselves.

       

      In this blog article, we will look at how the CamAnims system takes an input value from a control device (motion-tracking camera, keyboard, mouse, joypad, etc) and translates it into a point to jump to on an animation's timeline, enabling real-time forwarding and rewinding along the timeline as control inputs are made.

       

      Although the Unity engine is used for the development of 'My Father's Face', the CamAnims system uses ordinary math calculations that can be adapted to scripting in any engine with a timeline-based animation system, so long as you adjust the calculations to reflect the minimum and maximum values that the particular engine uses for the start and end points of an animation timeline.

       

      ANIMATION IN THE UNITY ENGINE

       

      In the Unity engine's animation system, animations are configured on a timeline sheet that can be played with script instructions from any point on the timeline, so long as the value is between '0' (the start of the animation clip) and '1' (the end of the clip). It does not matter whether the clip is 3 seconds long or 3,000 seconds – it will always have a start point of 0 and an end point of 1.

       

      1.jpg

       

      Usefully, the values of control axis inputs – at least, in the Unity engine – usually fall within the same '0' to '1' minimum and maximum range. This makes the inputs perfectly compatible with the scale used by the animation timeline.

       

      On analog joypad inputs, the value moves towards or away from '1' depending on how much analog button pressure is applied, or how far in a certain direction an analog stick is moved. 

       

      With digital inputs, the input value usually snaps immediately to the '1' end-value when the button is pressed, although you can set a digital input to increment a value progressively towards a maximum point for as long as that input has a value of '1' (i.e. it is being pressed, making its status 'true').

       

      In the example below, the depth of the player character's crouch towards the ground is controlled by an 'If' logic statement. The crouch position is stored in a variable called 'CrouchDepth', and if the crouch control is pressed (true) then the value of CrouchDepth is incremented or decremented by '0.15' until a minimum or maximum value is reached.

       

      // Increase the crouch depth while the crouch-down control is held
      // (GetAxis returns 1 while a digital input is pressed)
      if (Input.GetAxis ("Crouch Down") > 0) {

          CrouchDepth = CrouchDepth + 0.15f;

      }

      // Decrease the crouch depth while the crouch-up control is held
      if (Input.GetAxis ("Crouch Up") > 0) {

          CrouchDepth = CrouchDepth - 0.15f;

      }

      // Ensure that the crouch variable can never be less than 0 or greater than 1
      if (CrouchDepth > 1) {

          CrouchDepth = 1;

      }

      if (CrouchDepth < 0) {

          CrouchDepth = 0;

      }

       

      LINKING THE INPUT VALUE TO THE ANIMATION

       

      Once a variable has been set up to store a particular control input, we need to link that variable to an object in the project that we wish to influence using an animation clip.

       

      In the Unity engine specifically, objects that are to be controlled with Unity's animation system need to have an 'Animator' component attached to them. Other game engines may have a different means of animating objects.

       

      The Animator component is configured to look at a particular 'Animation Controller' component, within which animation clips can be stored. It is good practice to group together similar kinds of animation inside Animation Controllers whose names reflect that category of animation (e.g. Face_Controller, Arms_Controller, etc).

       

      2.jpg

       

      The Animator component will animate any object in the same way so long as it has the same name and the same hierarchy of objects around it. You can therefore easily use the same set of animations for multiple objects so long as they have the same names and hierarchy, even if they are textured differently or have additional pieces attached to them.

       

      For example, you can have two player characters that are identical internally but have different outward appearances. As long as the objects that the Animators are stored in have unique tag names – even if those objects share the same name – then linking to the Animator by tag name instead of object name allows the same animation to be used in numerous different objects simultaneously without conflict.

       

      To link an input value to an animation, two pieces of information are required:

       

      - A reference for the location within the project hierarchy of the object containing the Animator component. As mentioned above, a tag is the best way to make the linkage, since it ensures that you can use the same animation with multiple objects.

       

      - An instruction to play an animation clip at a certain point on its timeline, linking to that clip via the Animation Controller defined in the object's Animator component.

       

      In Unity specifically, a unique tag name can be defined by going to the 'Tag' option at the top of the 'Inspector' window (which is set to Untagged by default) and left-clicking on the tag name to drop a menu down. A new custom tag can then be added by clicking the 'Add Tag' option at the base of the tag list.

       

      Clicking this option displays a list of the project's tags, albeit in text boxes so that their names can be edited. At the base of the list is a '+' icon. Clicking this creates a new slot to define a tag, after which you click the 'Save' option to store it.

       

      3.jpg

       

      4.jpg

       

      Once the tag is defined, highlight the object that you wish to assign the tag to, then click on the tag name at the top of the Inspector window again and select the newly defined tag name from the drop-down list.

       

      5.jpg

       

      A useful tip: if you have many tags in your project that take ages to scroll through to reach the bottom of the list, you can reach the bottom instantly by pressing the 'up' arrow on the keyboard as soon as you open the list. It was years of frustrating list-scrolling before I discovered that invaluable little feature!

       

      SCRIPTING THE LINKAGE IN UNITY

       

      With the tag defined, you should now tell the game engine how to use that tag to communicate with the object that was assigned the tag.

       

      In the Unity engine, this is accomplished with the 'GameObject.FindWithTag' instruction. 

       

      To tell Unity how to find the object, we must first create a reference to it in the header section of the script, using the GameObject type. This example, which finds the faceplate object of Player 1's character, uses the C# language:

       

      6.jpg
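
      In case the screenshot is hard to make out, a minimal sketch of what such a declaration might look like is shown below. The reference name Faceplate_Player1 is an assumption, chosen to match the Animator examples later in this article:

      public GameObject Faceplate_Player1;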

       

      We will also need to define an 'Animator' reference so that Unity can communicate with the Animator component within the object that we direct it to. For easy script-reading, I recommend using the same name as the object reference, with '_Animator' added to the end to make clear what its function is and which object the specific Animator component that we are targeting belongs to.

       

      6a.jpg
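
      Again as a sketch (the screenshot above shows the actual declaration), an Animator reference following that naming convention might look like this:

      public Animator Faceplate_Player1_Animator;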

       

      In the Start() section of the script – which dictates what should happen when the script is activated – the GameObject.FindWithTag instruction is used to tell Unity which tag is associated with the object that we defined above. In our example, we tell Unity to look for an object that is tagged with the name 'Player 1 Faceplate' by putting the tag name within the quotation marks of the GameObject.FindWithTag instruction.

       

      7.jpg
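
      A sketch of how this could look inside Start(), assuming the tag and reference names used in this article:

      void Start () {

          // Find the object in the scene that carries the 'Player 1 Faceplate' tag
          Faceplate_Player1 = GameObject.FindWithTag ("Player 1 Faceplate");

      }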

       

      Next, the Animator reference that we defined above needs to have a project hierarchy location assigned to it. The format for this is the object reference that we linked to the tag, followed by the instruction

       

      GetComponent<Animator>();

       

      6b.jpg
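
      In sketch form, the assignment inside Start() might read:

      // Cache the Animator component stored inside the tagged object
      Faceplate_Player1_Animator = Faceplate_Player1.GetComponent<Animator> ();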

       

      PLAYING THE ANIMATION

       

      With these references defined so that Unity can locate the Animator component, we are now finally ready to provide Unity with the instruction to use the real-time control input value to forward and rewind through an animation clip's timeline.

       

      The play instruction must be placed in the Update() section of the script, or in a similar type of function that loops continuously until stopped (unlike functions such as Start() and Awake(), which only run once when the script activates).

       

      This is because the control input value needs to be repeatedly checked for changes so that it can pass updates to the animation clip to alter the timeline position that it is playing at a given moment in time.

       

      A typical example of a Play instruction for an animation clip in Unity takes the format:

       

      <Animator reference name>.Play("<name of animation clip>");

       

      For jumping to a specific point on the timeline, a slightly more complex format is used:

       

      <Animator reference name>.Play("<name of animation clip>", 0, <position on timeline between 0 and 1>);

       

      So for our player character faceplate example, if we wanted to jump to the middle point of the animation clip (0.5) then the instruction would look like this:

       

      Faceplate_Player1_Animator.Play("Faceplate_Move", 0, 0.5f);

       

      Note that in Unity, if a value has a decimal point then it should have a lower-case 'f' placed after the value to indicate to Unity that it is a 'floating point' number and not a rounded integer value.

       

      With the Play instruction in its current form though, all the instruction would do is make the referenced animation clip jump to its middle point and then keep the animation at that point in its timeline without changing. We need to connect the Play instruction to the live-updating control input value so that the timeline can respond immediately to the player's inputs.

       

      To accomplish this, we simply replace the static timeline position value with our control-value variable name. For example:

       

      Faceplate_Player1_Animator.Play("Faceplate_Move", 0, CrouchDepth);

       

      Now whenever an input is made to the particular control input referenced in the instruction, the animation timeline will immediately change, updating the animation of the object containing the Animator component that we linked to.
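
      To pull all of the above steps together, here is a minimal, self-contained sketch of how such a script could look in C#. The tag, reference and clip names follow this article's faceplate example, and the 'Crouch Down' and 'Crouch Up' axes are assumed to be defined in Unity's Input Manager:

      using UnityEngine;

      public class CamAnimExample : MonoBehaviour {

          public GameObject Faceplate_Player1;
          public Animator Faceplate_Player1_Animator;
          public float CrouchDepth = 0f;

          void Start () {
              // Locate the tagged object and cache its Animator component
              Faceplate_Player1 = GameObject.FindWithTag ("Player 1 Faceplate");
              Faceplate_Player1_Animator = Faceplate_Player1.GetComponent<Animator> ();
          }

          void Update () {
              // Read the crouch controls and keep the value clamped to the 0-1 range
              if (Input.GetAxis ("Crouch Down") > 0) { CrouchDepth += 0.15f; }
              if (Input.GetAxis ("Crouch Up") > 0) { CrouchDepth -= 0.15f; }
              CrouchDepth = Mathf.Clamp01 (CrouchDepth);

              // Jump the animation clip to the timeline position given by the input value
              Faceplate_Player1_Animator.Play ("Faceplate_Move", 0, CrouchDepth);
          }
      }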

       

      ADVANCED CONCEPTS

       

      The live-controlled animation principles that we have demonstrated in this article are relatively simple. When they are combined with logic statements such as 'If' and 'Else', though, they become truly flexible and powerful.

       

      You can monitor control input values and initiate particular events if the values fall within a certain range. You can also create complex control systems that are intuitively easy to use by treating animations as a series of channels, shutting off certain channels (each representing an animation) to allow one particular animation to play, and then closing that animation off and enabling a different animation channel to be active.

       

      The arms in 'My Father's Face' have multi-directional shoulder jointing so that they can swing up and down, inward and outward, and forward and back, replicating a real human shoulder and the full range of capabilities of a real arm.

       

      In the example of 'If' control logic below from 'My Father's Face', multiple conditions are checked when the 'walk forward' control is activated, in order to determine whether the player character's 'walk forward' leg animation should be permitted to play, or whether other leg animations such as turning and side-stepping have already claimed the right to play until their control input becomes false.

       

      8.jpg
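
      The screenshot shows the game's own logic; as a simplified, hypothetical sketch of the same channel idea, the walk-forward animation is only allowed to play when no other leg animation currently has a claim on the legs (the variable and clip names below are illustrative, not the game's actual identifiers):

      // Hypothetical sketch: only play the walk-forward animation if the turning
      // and side-stepping channels are not already claiming the legs
      if (Input.GetAxis ("Walk Forward") > 0 && TurningActive == false && SideStepActive == false) {

          Legs_Player1_Animator.Play ("Legs_Walk_Forward", 0, WalkCycle);

      }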

       

      The CamAnims system enables almost any pose possible for the human body to be replicated with an avatar in real-time, giving the person controlling it nearly as many possibilities for interaction as their real body provides them - from operating a virtual computer to performing the attack of their favorite anime or superhero character.

       

      Another application of control value analysis in projects where the player character's facial expressions are controllable is the detection of the player's current emotional state. 

      In 'My Father's Face', if the player's eyebrow CamAnim has a value near '0' then they are likely to be happy, because the brows are barely moved from their fully-raised starting position. If the CamAnim value is near or at '1' though then the player is likely to be sad, because their real-life frown is mirrored in the timeline position of the eyebrow animation clip.

       

      Facial CamAnim values can also be combined together for more complex emotional analysis. If only the mouth = '1' then the player may be 'Sad', as their lips are fully downturned. If the mouth CamAnim = '1' and the eyebrow CamAnim also = '1' though, then they are likely to be in an 'Angry' emotional state, because their mouth is down and their eyebrows fully furrowed.
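
      As a hypothetical sketch of such an analysis (MouthValue and EyebrowValue are illustrative variable names, each holding a CamAnim value in the 0 to 1 range):

      string Emotion = "Neutral";

      if (MouthValue >= 0.9f && EyebrowValue >= 0.9f) {

          Emotion = "Angry";   // mouth fully downturned and brows fully furrowed

      } else if (MouthValue >= 0.9f) {

          Emotion = "Sad";     // lips downturned but brows relaxed

      } else if (EyebrowValue <= 0.1f) {

          Emotion = "Happy";   // brows barely moved from their raised starting position

      }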

       

      The same methodology can be used to work out the physical pose that the player's body is in, by analyzing the CamAnims of the waist, neck, arm and leg joints.

       

      CONTROL EVERYTHING

       

      In 'My Father's Face', we take full advantage of this flexibility by using an innovative control layout that enables the fingers to flow across the pad and control all actions on a single joypad.

       

      Instead of assigning walking to an upward push of the left stick, we assign it to the left digital bumper button, whilst turning left and right is assigned to the left and right analog trigger buttons instead of the right stick. This layout frees up both sticks to control the left and right arms of the player character independently at the same time whilst in motion. It only takes a minute to get used to and is as effortless to use as the traditional stick-navigation system.

       

      To assist the player in using all the control possibilities at their fingertips to maximum effect, we also have an optional 'auto walk' button, so that the game automatically takes care of walking and running until auto-walk is pressed again to toggle it off. This enables the player to take their finger away from the walk button and focus on pulling off incredible feats with the arm sticks and other action controls (for example, leaping into the air, changing direction in mid-air and pulling down on the sticks to swing the arms down towards a duel opponent below).

       

      ACCESSIBILITY FOR ALL

       

      Another advantage of the CamAnims live animation system is that multiple control device types can be connected to the same control variables, so that the player can mix and match controllers to the way that is best for their play style and physical capabilities. 

       

      One player may play only with a joypad, another may use a mouse and keyboard combo, and someone else may use physical joypad controls for walking and running whilst using a motion-tracking camera such as Intel RealSense for controlling the rest of their player character's body. 

       

      This flexibility means that players across all ranges of interaction capability, from young children with low motor skills to severely disabled players using accessibility devices, can take part in the game world on an equal footing.

       

      CONCLUSION

       

      This introductory article only scratches the surface of the CamAnims system's true power and complexity and what can be achieved by building upon these foundational principles (for example, integrating it with further control and monitoring systems and more complex input value management mechanisms like we have in 'My Father's Face'). If you enjoyed this article and found it useful, please provide feedback in the comments with specific questions or requests for follow-up articles on the subject. Thanks for reading!