It's not all written on the robot's face

Past work on creating robots that can make convincing emotional expressions has concentrated on the quality of those expressions, and on assessing people's ability to recognize them in neutral contexts without any strong emotional valence. It remains an open question whether observers' judgments of a robot's facial cues are affected by the surrounding emotional context. This paper takes its inspiration from the contextual effects found in the interpretation of expressions on human faces and computer avatars, and examines the extent to which they also apply to the interpretation of the facial expressions of a mechanical robot head. The kinds of context that affect the recognition of robot emotional expressions, the circumstances under which such contextual effects occur, and the relationship between emotions and the surrounding situation are observed and analyzed. Design implications for believable emotional robots are drawn.
