Learning to recognize affective body postures

Robots are assuming an increasingly important role in our society. They now serve as pets and help support children's healing; in other words, they attempt to maintain active, affective communication with human agents. Up to now, however, such systems have primarily relied on the human agents' ability to empathize with the system: changes in the system's behavior can induce changes of mood or behavior in the human partner, but current systems do not react to users, or react only in clearly pre-defined ways. In that sense, current systems miss the bi-directionality typical of human social interaction. Social interaction is characterized by multi-channel communication, in which each actor captures and reacts to signals from the other. To achieve social interaction, a computer or a robot must therefore be able to capture and interpret the signals sent by its human partner. One of the most important channels of communication is physical interaction: the body is used to interpret the affective state of an interlocutor. This paper describes experiments we carried out to study the importance of body language in affective communication. The results of these experiments led us to develop a system that can incrementally learn to recognize affective states from body postures.
