Playing a different imitation game: Interaction with an Empathic Android Robot

Current research has identified the need to equip robots with perceptual capabilities that not only recognise objective entities such as visual or auditory objects, but that also assess the affective state of the human communication partner, in order to make the interaction more natural and social. In analogy to Watzlawick's statement that "one cannot not communicate" (1967), it has been found that in human-robot interaction, too, one cannot not be emotional. It is therefore crucial for a robot to understand the affective signals of its communication partner and to react to them. However, online emotion recognition in real-time, interactive systems has scarcely been attempted so far, as the demands on robustness and response time are very high. In this paper we present an empathic anthropomorphic robot (torso) that mirrors the emotional states happiness, fear, and neutral, as recognised from the speech signal, through facial expressions. The recognition component as well as the facial expression generation are described in detail. We report on results from experiments with humans interacting with the empathic robot.
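To make the pipeline described above concrete, the following Python sketch illustrates the speech-to-face loop in miniature: prosodic features are extracted from an utterance, classified into one of the three emotional states, and mapped to facial-expression targets that the robot face mirrors. All names, thresholds, and the rule-based classifier are hypothetical placeholders; the paper's actual system uses a trained recognition component operating on the speech signal (cf. [2], [8]) and a dedicated facial expression generator, not hand-set rules.

```python
"""Minimal sketch of the speech-to-face empathy loop (illustrative only)."""
from dataclasses import dataclass
from enum import Enum, auto


class Emotion(Enum):
    NEUTRAL = auto()
    HAPPINESS = auto()
    FEAR = auto()


@dataclass
class ProsodicFeatures:
    mean_pitch_hz: float   # fundamental frequency averaged over the utterance
    energy: float          # normalised signal energy in [0, 1]
    speech_rate: float     # rough proxy: syllables per second


# Hypothetical mapping from a recognised emotion to facial-motor targets,
# e.g. eyebrow raise and mouth-corner positions in the range [-1, 1].
EXPRESSION_TARGETS = {
    Emotion.NEUTRAL:   {"brows": 0.0, "mouth_corners": 0.0},
    Emotion.HAPPINESS: {"brows": 0.3, "mouth_corners": 0.8},
    Emotion.FEAR:      {"brows": 0.9, "mouth_corners": -0.4},
}


def classify_emotion(f: ProsodicFeatures) -> Emotion:
    """Toy rule-based stand-in for a trained classifier.

    Raised pitch and energy are crude correlates of arousal; the split
    between happiness and fear by speech rate is purely illustrative.
    """
    if f.mean_pitch_hz > 250 and f.energy > 0.6:
        return Emotion.FEAR if f.speech_rate > 5.0 else Emotion.HAPPINESS
    return Emotion.NEUTRAL


def mirror(emotion: Emotion) -> None:
    """Send the facial-expression targets to the (here: simulated) robot face."""
    targets = EXPRESSION_TARGETS[emotion]
    print(f"set face -> {emotion.name.lower()}: {targets}")


if __name__ == "__main__":
    # Simulated utterances standing in for a live audio front end.
    utterances = [
        ProsodicFeatures(mean_pitch_hz=180, energy=0.3, speech_rate=4.0),
        ProsodicFeatures(mean_pitch_hz=280, energy=0.7, speech_rate=4.2),
        ProsodicFeatures(mean_pitch_hz=300, energy=0.8, speech_rate=5.5),
    ]
    for f in utterances:
        mirror(classify_emotion(f))
```

The point of the sketch is the architecture, not the classifier: a perceptual front end produces features, a recogniser reduces them to a small set of affective states, and the mirroring step is a fixed mapping from state to facial-motor targets, which is what keeps the loop fast enough for real-time interaction.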

[1] Alex Black, et al. Motor mimicry as primitive empathy, 1987.

[2] Astrid Paeschke, et al. A database of German emotional speech, 2005, INTERSPEECH.

[3] Jannik Fritsch, et al. Human-like person tracking with an anthropomorphic robot, 2006, Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006).

[4] Christian Bauckhage, et al. An XML based framework for cognitive vision architectures, 2004, Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004).

[5] Nicole Chovil. Social determinants of facial displays, 1991.

[6] P. Watzlawick, et al. Pragmatics of Human Communication: A Study of Interactional Patterns, Pathologies and Paradoxes, 1967.

[7] Jannik Fritsch, et al. Humanoid robot platform suitable for studying embodied interaction, 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[8] Elisabeth André, et al. Comparing Feature Sets for Acted and Spontaneous Speech in View of Automatic Emotion Recognition, 2005, 2005 IEEE International Conference on Multimedia and Expo.

[9] Philip J. Willis, et al. A Layered Dynamic Emotion Representation for the Creation of Complex Facial Expressions, 2003, IVA.

[10] J. Bavelas, et al. "I show how you feel": Motor mimicry as a communicative act, 1986.

[11] P. Ekman, et al. Emotion in the Human Face: Guidelines for Research and an Integration of Findings, 1972.