Evaluating facial displays of emotion for the android robot Geminoid F

With android robots becoming increasingly sophisticated in both their technical and artistic design, their non-verbal expressiveness is approaching that of real humans. This paper presents the results of two online surveys designed to evaluate a female android's facial displays of five basic emotions. We prepared both surveys in English, German, and Japanese, allowing us to analyze inter-cultural differences. We found not only that our designs of the emotional expressions "fearful" and "surprised" were often confused, but also that, in contrast to the German and English participants, many Japanese participants confused "angry" with "sad". Although similar facial displays portrayed by the model person of Geminoid F achieved higher recognition rates overall, portraying fear proved similarly difficult for the model person. We conclude that improving the android's expressiveness, especially around the eyes, would be a useful next step in android design. These results could be complemented by an evaluation of dynamic facial expressions of Geminoid F in future research.
