Robot initiative in a team learning task increases the rhythm of interaction but not the perceived engagement

We hypothesize that a robot's initiative during a collaborative task with a human can influence the pace of the interaction, the human response to attention cues, and the perceived engagement. We propose an object-learning experiment in which a human interacts in a natural way with the humanoid robot iCub. In a two-phase scenario, the human teaches the robot the properties of a set of objects. We compare the effect of which partner initiates the task in the teaching phase (human or robot) on the rhythm of the interaction in the verification phase, measuring the reaction time of the human gaze in response to the robot's attention utterances. Our experiments show that when the robot initiates the learning task, the pace of interaction is higher and the reaction to attention cues is faster. Subjective evaluations suggest, however, that the robot's initiating role does not affect the perceived engagement. Moreover, subjective and third-person evaluations of the interaction task suggest that the attentive mechanism we implemented in the humanoid robot iCub is able to arouse engagement and make the robot's behavior readable.