Affect Recognition for Interactive Companions

Affect sensitivity is an important requirement for artificial companions that engage in social interaction with human users. This paper provides a general overview of the issues arising in the design of an affect recognition framework for artificial companions. Limitations and challenges are discussed with respect to other capabilities of companions and to real-world scenarios for affect-sensitive human-companion interaction.
