Detecting user engagement with a robot companion using task and social interaction-based features

Affect sensitivity is of the utmost importance for a robot companion to be able to display socially intelligent behaviour, a key requirement for sustaining long-term interactions with humans. This paper explores a naturalistic scenario in which children play chess with the iCat, a robot companion. A person-independent Bayesian approach to detecting the user's engagement with the iCat robot is presented. Our framework models both causes and effects of engagement: features related to the user's non-verbal behaviour, the task, and the companion's affective reactions are identified to predict the children's level of engagement. An experiment was carried out to train and validate our model. Results show that our approach, based on multimodal integration of task and social interaction-based features, outperforms approaches based solely on non-verbal behaviour or contextual information (94.79% vs. 93.75% and 78.13%).
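The paper itself reports a Bayesian model trained on annotated child-robot chess sessions and does not include code. As a rough, non-authoritative sketch of the fusion idea, the snippet below trains a naive Bayes classifier (a simpler stand-in for the paper's Bayesian model) on a combined vector of non-verbal and task/context features, using leave-one-child-out cross-validation to approximate person independence. All feature names, data, and group labels are hypothetical illustrations, not the paper's actual dataset.

```python
# Minimal sketch: engagement detection via Bayesian fusion of non-verbal
# and task/context features. Data and feature names are hypothetical.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 96  # hypothetical number of annotated interaction segments

# Non-verbal behaviour features (e.g. gaze at robot, smiles) -- illustrative only
nonverbal = rng.random((n, 2))
# Task and companion features (e.g. game state, iCat's affective reaction) -- illustrative only
context = rng.random((n, 2))

X = np.hstack([nonverbal, context])     # multimodal feature vector
y = (X.mean(axis=1) > 0.5).astype(int)  # synthetic engagement labels
groups = rng.integers(0, 5, size=n)     # child ID, for person-independent evaluation

# Leave-one-child-out cross-validation: each child is held out in turn,
# so the classifier is never tested on a child it was trained on.
preds = np.empty_like(y)
for train, test in LeaveOneGroupOut().split(X, y, groups):
    model = GaussianNB().fit(X[train], y[train])
    preds[test] = model.predict(X[test])

print(f"Person-independent accuracy: {accuracy_score(y, preds):.2%}")
```

Dropping either the `nonverbal` or the `context` columns from `X` before fitting gives the single-modality baselines the abstract compares against.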
