You do not need scripting to move an object in Unity, as the R2 SDK has a tool set called the Unity Toolkit that contains pre-made tracking scripts you can simply drag and drop onto objects. One of these, 'TrackingAction', can detect a facial expression and move or rotate an object by a certain amount, based on the settings you configure in TrackingAction's menus in Unity's Inspector panel.
I have written a wide range of detailed step-by-step guides on using TrackingAction in Unity and on adapting its code to improve its capabilities.
I will also show you a very old video from my own Unity game project in which I control an avatar character's face parts with TrackingAction. I am a little embarrassed to show it, as my body-animation technology is far more advanced now, but it is the best example I have of animating directly with TrackingAction (I later wrote my own custom real-time animation system for Unity, called CamAnims).
Here are details of the more advanced CamAnims system, for further reading.
Thank you for answering.
I watched the little video.
It seems I can get some hints from it, so I will look at the details later.
In the future I am thinking of changing the 3D model's face according to the person's emotion.
There are two ways that you could detect emotions:
1. Use a 'SendMessageAction' component from the Unity Toolkit to trigger an action when a certain facial expression is detected, such as a smile, an open mouth or closed eyes.
2. Use the CamAnims method described in my CamAnims article to use the values of an object to calculate what the current emotion expressed by the player in front of the camera is.
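If you go with option 1, the receiving side might be sketched roughly like this. This is only a minimal illustration of how SendMessageAction's message reaches your own script: the method name 'OnSmileDetected' is an invented example, and must match whatever function name you actually enter in SendMessageAction's settings in the Inspector.

```csharp
using UnityEngine;

// Attach this to the GameObject that the SendMessageAction component
// targets. SendMessageAction delivers its trigger via Unity's message
// system, so the method name below must exactly match the function
// name configured in the SendMessageAction rule in the Inspector.
public class ExpressionReceiver : MonoBehaviour
{
    // Hypothetical handler name for a smile-detected rule -- rename it
    // to match your own SendMessageAction configuration.
    public void OnSmileDetected()
    {
        Debug.Log("Smile expression triggered");
        // e.g. swap the model's face blend shape or play an animation here
    }
}
```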
I tried method 1.
However, I found it difficult to reach the level I wanted with this method,
because FaceExpression's setting items were insufficient.
I thought it would be better to use PersonExpressionsEnum rather than FaceExpression for judging emotion values.
In particular, I think it would be interesting if the model reacted based on judging the emotions "SADNESS", "SURPRISE" and "ANGER".
When I tried to judge these emotions, I felt that FaceExpression could not handle it.
So next I will try whether it can be done with method 2.
FaceExpression Intel® RealSense™ SDK 2016 R2 Documentation
PersonExpressionsEnum Intel® RealSense™ SDK 2016 R2 Documentation
I tried method 2.
It acquires how much each part of the face has moved from its initial value and makes a judgment by combining those values.
I got the mouth movement working, so if I can also get the eyebrow movement, I may be able to do something similar.
However, since it seems difficult to create judgment logic for each emotion independently, I thought it would be easier if RealSense itself could judge the emotion.
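The delta-from-initial-value judgment described above could be sketched like this. It is only an illustration of the idea, not code from my actual project: the two Transform references stand in for objects driven by TrackingAction, and the 0.1f thresholds are invented placeholders you would tune by observing real values.

```csharp
using UnityEngine;

// Sketch of method 2: record each face part's position on the first
// frame as a neutral baseline, then combine the distances moved since
// then into a rough emotion guess. Part references and thresholds are
// placeholders -- tune them against your own tracked objects.
public class EmotionJudge : MonoBehaviour
{
    public Transform mouth;     // object driven by a mouth TrackingAction
    public Transform eyebrow;   // object driven by an eyebrow TrackingAction

    private Vector3 mouthStart;
    private Vector3 eyebrowStart;

    void Start()
    {
        // Capture the neutral-face positions as the baseline.
        mouthStart = mouth.localPosition;
        eyebrowStart = eyebrow.localPosition;
    }

    void Update()
    {
        // How far each part has moved from its initial value.
        float mouthDelta = (mouth.localPosition - mouthStart).magnitude;
        float eyebrowDelta = (eyebrow.localPosition - eyebrowStart).magnitude;

        // Combine the deltas into a rough judgment. The 0.1f thresholds
        // are invented for illustration only.
        if (mouthDelta > 0.1f && eyebrowDelta > 0.1f)
            Debug.Log("Guess: SURPRISE");
        else if (eyebrowDelta > 0.1f)
            Debug.Log("Guess: ANGER");
    }
}
```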
I do not think that Intel will create a new emotion detection program for RealSense. I am sure that they would be happy if somebody else made one and shared it with the RealSense community though.
I understand now that there is no emotion detection program. That was helpful, thank you.