Comparing Models of Disengagement in Individual and Group Interactions

Changes in the type of interaction (e.g., individual vs. group interactions) can potentially impact data-driven models developed for social robots. In this paper, we provide a first investigation of the effects of changing group size on data-driven models for HRI, by analyzing how a model trained on data collected from participants interacting individually performs on test data collected from group interactions, and vice versa. A model combining data from both individual and group interactions is also investigated. We perform these experiments in the context of predicting disengagement behaviors in children interacting with two social robots. Our results show that a model trained on group data generalizes better to individual participants than the other way around. The mixed model appears to be a good compromise, but it does not reach the performance of the models trained for a specific type of interaction.
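The evaluation protocol described above can be illustrated with a minimal sketch: train a classifier on one interaction type, test it on the other, and compare against a model trained on the pooled data. The sketch below assumes scikit-learn (whose SVC wraps LIBSVM, the library the paper builds on); the feature matrices, labels, and the helper `cross_condition_f1` are hypothetical placeholders standing in for the paper's actual disengagement features.

```python
# A hedged sketch of the cross-condition evaluation, not the paper's code.
# Random data stands in for the individual- and group-interaction datasets.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 200 samples, 10 features per interaction type.
X_ind, y_ind = rng.normal(size=(200, 10)), rng.integers(0, 2, 200)
X_grp, y_grp = rng.normal(size=(200, 10)), rng.integers(0, 2, 200)

# Hold out test portions so the mixed model is never tested on training data.
Xi_tr, Xi_te, yi_tr, yi_te = train_test_split(X_ind, y_ind, test_size=0.3, random_state=0)
Xg_tr, Xg_te, yg_tr, yg_te = train_test_split(X_grp, y_grp, test_size=0.3, random_state=0)

def cross_condition_f1(X_train, y_train, X_test, y_test):
    """Train on one interaction condition, report F1 on another."""
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    return f1_score(y_test, clf.predict(X_test))

# Train on one interaction type, test on the other, and vice versa.
print("individual -> group:", cross_condition_f1(Xi_tr, yi_tr, Xg_te, yg_te))
print("group -> individual:", cross_condition_f1(Xg_tr, yg_tr, Xi_te, yi_te))

# Mixed model: pool both training sets, then test on each held-out set.
X_mix, y_mix = np.vstack([Xi_tr, Xg_tr]), np.concatenate([yi_tr, yg_tr])
print("mixed -> group:", cross_condition_f1(X_mix, y_mix, Xg_te, yg_te))
print("mixed -> individual:", cross_condition_f1(X_mix, y_mix, Xi_te, yi_te))
```

With real features, comparing these four scores is what supports the paper's conclusion: the group-trained model transferring well to individuals, and the mixed model sitting between the condition-specific ones.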
