A categorical approach to affective gesture recognition

Emotion is currently the focus of considerable research attention. Studies in psychology and neurology have highlighted the importance of emotion in the development and support of intelligent and social behaviour. The recognition of affective states has therefore become a critical capability for socially developing robots, which are increasingly expected to act as social companions. In this paper, we address the problem of endowing robots with the ability to incrementally learn to recognize the affective state of their human partner by interpreting the partner's gestural cues. We propose a model that self-organizes postural features into affective categories and uses contextual feedback from the partner to drive the learning process.
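As a rough illustration of the general idea of self-organized, feedback-driven categorization (not the model described in the paper), the sketch below implements a minimal vigilance-gated categorizer in the spirit of adaptive resonance: a postural feature vector either resonates with an existing category prototype, which adapts only when the partner's contextual feedback is positive, or recruits a new category. The class name PosturalCategorizer and the parameters vigilance and learning_rate are illustrative assumptions, not quantities taken from the paper.

```python
# Hypothetical sketch of self-organizing categorization of postural features,
# gated by contextual feedback from the human partner. This is NOT the paper's
# actual model; it only illustrates a vigilance-gated, incrementally growing
# categorizer. All names and parameter values are assumptions.

import numpy as np


class PosturalCategorizer:
    def __init__(self, vigilance=0.85, learning_rate=0.2):
        self.vigilance = vigilance          # similarity required to join a category
        self.learning_rate = learning_rate  # how fast prototypes adapt
        self.prototypes = []                # one prototype vector per category

    def _similarity(self, x, proto):
        # Cosine similarity between a postural feature vector and a prototype.
        return float(np.dot(x, proto) /
                     (np.linalg.norm(x) * np.linalg.norm(proto) + 1e-9))

    def categorize(self, features, feedback_positive):
        """Assign a feature vector to a category; learn only on positive feedback."""
        x = np.asarray(features, dtype=float)

        # Find the best-matching existing category.
        best, best_sim = None, -1.0
        for i, proto in enumerate(self.prototypes):
            sim = self._similarity(x, proto)
            if sim > best_sim:
                best, best_sim = i, sim

        if best is not None and best_sim >= self.vigilance:
            # Resonance: adapt the winning prototype, but only when the partner's
            # contextual feedback confirms the interpretation.
            if feedback_positive:
                self.prototypes[best] += self.learning_rate * (x - self.prototypes[best])
            return best

        # No existing category is close enough: recruit a new one incrementally.
        self.prototypes.append(x.copy())
        return len(self.prototypes) - 1


# Usage: feed a stream of (postural features, partner feedback) pairs.
model = PosturalCategorizer()
category = model.categorize([0.3, 0.7, 0.1, 0.9], feedback_positive=True)
```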
