Automatic analysis of affective postures and body motion to detect engagement with a game companion

The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires the system to be trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper we propose an initial computational model that automatically analyses human postures and body motion to detect the engagement of children playing chess with an iCat robot acting as a game companion. Our approach uses vision-based, automatic extraction of expressive postural features from videos that capture the children's behaviour from a lateral view. An initial evaluation, conducted by training several recognition models with contextualised affective postural expressions, suggests that patterns of postural behaviour can accurately predict the children's engagement with the robot. This makes our approach suitable for integration into an affect recognition system for a game companion in a real-world scenario.
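
To make the pipeline concrete, the sketch below illustrates the kind of processing the abstract describes: simple postural and motion features are extracted from a lateral-view video with OpenCV, and a standard classifier is trained on per-window engagement labels. This is a minimal illustration, not the authors' implementation; the specific features (a silhouette width/height ratio as a lean proxy, centroid height, frame-difference motion), the 30-frame window, the label file, and the random-forest classifier are all assumptions introduced here for the example.

```python
# Minimal sketch of a lateral-view posture/motion feature extractor plus
# engagement classifier. Feature choices and file names are illustrative
# assumptions, not the paper's actual feature set.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def extract_features(video_path, window=30):
    """Return one feature vector per window of `window` frames:
    silhouette width/height ratio (rough body-lean proxy), normalised
    silhouette centroid height, and quantity of motion from frame
    differencing."""
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    prev_gray, feats, buf = None, [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = bg.apply(frame)  # foreground silhouette of the child
        ys, xs = np.nonzero(mask)
        if len(xs) > 0:
            lean = (xs.max() - xs.min()) / max(ys.max() - ys.min(), 1)
            centroid_y = ys.mean() / mask.shape[0]
        else:
            lean, centroid_y = 0.0, 0.0
        motion = 0.0
        if prev_gray is not None:
            motion = float(np.mean(cv2.absdiff(gray, prev_gray))) / 255.0
        prev_gray = gray
        buf.append((lean, centroid_y, motion))
        if len(buf) == window:
            feats.append(np.mean(buf, axis=0))
            buf = []
    cap.release()
    return np.array(feats)


# Hypothetical usage: one label per window (engaged = 1, not engaged = 0),
# obtained from human annotation of the same recordings.
# X = np.vstack([extract_features(p) for p in video_paths])
# y = np.load("engagement_labels.npy")
# clf = RandomForestClassifier(n_estimators=100, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())
```

In a real system the hand-crafted features above would be replaced by the expressive postural features the paper extracts, but the overall structure (per-window feature extraction followed by supervised classification against annotated engagement labels) follows the evaluation setup the abstract outlines.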
