An open-ended approach to evaluating Android faces

Understanding expectations and intentions through nonverbal behavior is a key topic of interest for socially embedded robots. This study presents the results of an open-ended method for evaluating how adult subjects interpret android facial expressions, conducted through an online survey with video stimuli. An open-ended question yields more spontaneous answers about the situations that can be associated with the synthetic emotional displays of an android face. The robot used was the Geminoid-DK, recorded while communicating the six basic emotions. The filtered results revealed situations highly relevant to the portrayed facial expressions for the emotions of Surprise, Fear, Anger, and Happiness, and less relevant for the emotions of Disgust and Sadness. Statistical analysis indicated a moderate degree of correlation between the Fear-Surprise pair and a high degree of correlation between the Disgust-Sadness pair. With a set of facial expressions validated prior to nonverbal emotional communication, androids and other humanoids can convey more accurate messages to their interaction partners and overcome the constraints of their currently limited affective interfaces.
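The pairwise-correlation analysis described above can be sketched as follows. This is a minimal illustration, not the study's actual procedure or data: it assumes the open-ended answers were coded into shared response categories, and both the category structure and the counts below are invented for demonstration.

```python
# Hypothetical sketch: measuring how similarly two intended emotions are
# interpreted, by correlating the counts of coded response categories.
# All numbers here are invented, not taken from the study.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# For each intended emotion: how many participants associated each of
# four (hypothetical) coded situation themes with the displayed face.
responses = {
    "Fear":     [12, 9, 3, 1],
    "Surprise": [10, 8, 4, 2],
    "Disgust":  [2, 3, 11, 9],
    "Sadness":  [1, 4, 12, 10],
}

r_fear_surprise = pearson(responses["Fear"], responses["Surprise"])
r_disgust_sadness = pearson(responses["Disgust"], responses["Sadness"])
```

A high coefficient for a pair (as the study reports for Disgust-Sadness) would indicate that participants associate the two displays with overlapping situations, i.e., the expressions are not well discriminated.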
