Towards Affect-Sensitive and Socially Perceptive Companions

This chapter examines affect sensitivity as a key requirement for socially perceptive companions. It identifies the challenges that arise in designing an affect recognition framework for artificial companions, and it presents a multi-level approach to analysing non-verbal affective expressions in human-companion interaction. The chapter concludes by discussing why affect recognition matters for generating empathic reactions and for establishing long-term human-companion relationships.
