Towards Realistic Facial Behaviour in Humanoids - Mapping from Video Footage to a Robot Head

Rehabilitation robotics and physical therapy could benefit greatly from engaging, motivating robotic caregivers that respond to patients' emotional and social cues. Recent studies indicate that human-machine interactions are more believable and memorable when a physical entity is present, provided the machine behaves realistically. Face-to-face communication is desirable because it is the most natural and efficient way of exchanging information and does not require users to alter their habits. Towards this end, we describe a process for animating a robot head from video input of a human head. We map the 2D coordinates of tracked facial feature points into the robot's servo space using Partial Least Squares (PLS) regression, learned from a small set of keyframes manually created by an animator. The method is efficient, robust to tracking errors, and independent of the scale of the face being tracked.
