Facial expressions as feedback cue in human-robot interaction—a comparison between human and automatic recognition performances

Facial expressions are an important nonverbal communication cue: they provide feedback in conversations between people and also in human-robot interaction. This paper presents an evaluation of three standard pattern recognition techniques (active appearance models, Gabor energy filters, and raw images) for interpreting facial feedback in terms of valence (success and failure) and compares the results to human performance. The database used contains videos of people interacting with a robot by teaching it the names of several objects. After the teaching phase, the robot has to name the objects correctly; the subjects reacted to its answers with spontaneous facial expressions, which were classified in this work. One main result is that automatic classification of facial expressions in terms of valence using simple standard pattern recognition techniques is possible with an accuracy comparable to the average human classification rate, although, as with human raters, the accuracy varies strongly between subjects.
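The Gabor energy features mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the paper's implementation; the filter parameters (kernel size, wavelength, orientation, envelope width) are arbitrary assumptions for the example. Gabor energy combines the responses of a quadrature pair of filters (an even/cosine and an odd/sine Gabor kernel), yielding a feature that responds to oriented facial texture largely independent of local phase:

```python
import numpy as np

def gabor_pair(size, wavelength, theta, sigma):
    """Quadrature pair of Gabor kernels: even (cosine) and odd (sine)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate coordinates to the filter orientation theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    even = envelope * np.cos(2.0 * np.pi * xr / wavelength)
    odd = envelope * np.sin(2.0 * np.pi * xr / wavelength)
    return even, odd

def gabor_energy(image, size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    """Gabor energy map: magnitude of the quadrature-pair responses.

    Naive valid-mode correlation with plain loops, to stay
    dependency-free; real systems would use FFT-based filtering.
    """
    even, odd = gabor_pair(size, wavelength, theta, sigma)
    h, w = image.shape
    out = np.zeros((h - size + 1, w - size + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + size, j:j + size]
            e = float((patch * even).sum())
            o = float((patch * odd).sum())
            out[i, j] = np.hypot(e, o)  # energy = sqrt(e^2 + o^2)
    return out

# Toy "face patch": vertical stripes roughly matching the wavelength
img = np.zeros((16, 16))
img[:, ::2] = 1.0
energy = gabor_energy(img)
```

In a classification pipeline like the one the abstract describes, energy maps from a bank of such filters (several orientations and wavelengths) over a normalized face image would be flattened into a feature vector and fed to a standard classifier.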
