Neuroscientifically-Grounded Research for Improved Human-Robot Interaction

The present study highlights the benefits of using well-controlled experimental designs, grounded in experimental psychology and objective neuroscientific methods, for advancing human-robot interaction (HRI) research. More specifically, we aimed to implement a well-studied paradigm of attentional cueing through gaze (so-called “joint attention” or “gaze cueing”) in an HRI protocol involving the iCub robot. Consistent with documented findings in gaze-cueing research, we found faster response times and enhanced event-related potentials (ERPs) in the EEG signal for the discrimination of cued, relative to uncued, targets. These results are informative for the robotics community, showing that a humanoid robot with mechanical eyes and human-like facial features is in fact capable of engaging a human in joint attention to a similar extent as another human would. More generally, we propose that combining neuroscience methods with an HRI protocol contributes both to understanding the mechanisms of human social cognition in interactions with robots and to improving robot design, thanks to systematic, well-controlled experimentation that taps into specific cognitive mechanisms of the human partner, such as joint attention.
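To make the paradigm concrete: in a gaze-cueing trial, the robot first shifts its gaze left or right, a target then appears either at the gazed-at (cued) or opposite (uncued) location, and the participant's response time is recorded; the cueing effect is the uncued-minus-cued mean RT difference. The Python sketch below illustrates this trial structure with a non-predictive (50% valid) cue, the standard condition for measuring reflexive gaze cueing. All names, timings, and the simulated response times here are illustrative assumptions, not the study's actual stimulus or analysis code.

```python
import random
import statistics

# Assumed stimulus-onset asynchrony between gaze cue and target (illustrative).
CUE_TARGET_SOA_MS = 500


def make_trials(n_trials=80, validity=0.5):
    """Build a shuffled trial list for a gaze-cueing paradigm.

    With validity=0.5 the gaze cue does not predict the target location,
    so any RT advantage for cued targets reflects reflexive orienting.
    """
    trials = []
    for _ in range(n_trials):
        gaze = random.choice(["left", "right"])
        cued = random.random() < validity
        target = gaze if cued else ("left" if gaze == "right" else "right")
        trials.append({"gaze": gaze, "target": target,
                       "condition": "cued" if cued else "uncued"})
    random.shuffle(trials)
    return trials


def cueing_effect(responses):
    """Mean RT difference (uncued - cued), in ms; a positive value means
    faster discrimination of gaze-cued targets, i.e. a joint-attention effect."""
    by_cond = {"cued": [], "uncued": []}
    for r in responses:
        by_cond[r["condition"]].append(r["rt_ms"])
    return statistics.mean(by_cond["uncued"]) - statistics.mean(by_cond["cued"])


if __name__ == "__main__":
    # Simulated data: cued targets answered ~30 ms faster, a magnitude in the
    # range typically reported in gaze-cueing studies (illustrative only).
    responses = [{"condition": t["condition"],
                  "rt_ms": random.gauss(420 if t["condition"] == "cued" else 450, 40)}
                 for t in make_trials()]
    print(f"Gaze-cueing effect: {cueing_effect(responses):.1f} ms")
```

The same cued-versus-uncued contrast carries over to the ERP analysis, where mean amplitudes of early sensory components in the EEG signal, rather than response times, are compared across conditions.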
