Nonverbal communication in socially assistive human-robot interaction

Socially assistive robots help human users through interactions that are inherently social. Examples include robot tutors that provide students with personalized one-on-one lessons (Ramachandran, Litoiu, & Scassellati, 2016), robot therapy assistants that mediate social interactions between children with autism spectrum disorder (ASD) and adult therapists (Scassellati, Admoni, & Matarić, 2012), and robot coaches that motivate children to make healthy eating choices (Short et al., 2014).

[1]  Kai Vogeley,et al.  A Non-Verbal Turing Test: Differentiating Mind from Machine in Gaze-Based Social Interaction , 2011, PloS one.

[2]  Hiroshi Ishiguro,et al.  Head motion during dialogue speech and nod timing control in humanoid robots , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[3]  John R. Anderson,et al.  ACT-R: A Theory of Higher Level Cognition and Its Relation to Visual Attention , 1997, Hum. Comput. Interact..

[4]  E. Torres-Jara,et al.  Challenges for Robot Manipulation in Human Environments , 2006 .

[5]  H. H. Clark Coordinating with each other in a material world , 2005 .

[6]  Mel Slater,et al.  The impact of eye gaze on communication using humanoid avatars , 2001, CHI.

[7]  Gabriel Skantze,et al.  Perception of gaze direction for situated interaction , 2012, Gaze-In '12.

[8]  Andrea Lockerd Thomaz,et al.  Teaching and working with robots as a collaboration , 2004, Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems, 2004. AAMAS 2004..

[9]  Peter Robinson,et al.  Cooperative gestures: Effective signaling for humanoid robots , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[10]  Sean Andrist,et al.  Conversational Gaze Aversion for Humanlike Robots , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[11]  Matthew W. Crocker,et al.  Visual attention in spoken human-robot interaction , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[12]  Rachid Alami,et al.  Physiological and subjective evaluation of a human-robot object hand-over task. , 2011, Applied ergonomics.

[13]  M. Argyle,et al.  Gaze and Mutual Gaze , 1994, British Journal of Psychiatry.

[14]  Junji Yamato,et al.  A probabilistic inference of multiparty-conversation structure based on Markov-switching models of gaze patterns, head directions, and utterances , 2005, ICMI '05.

[15]  Reid G. Simmons,et al.  Estimating human interest and attention via gaze analysis , 2013, 2013 IEEE International Conference on Robotics and Automation.

[16]  V. Bruce,et al.  Reflexive visual orienting in response to the social attention of others , 1999 .

[17]  Bilge Mutlu,et al.  Robot behavior toolkit: Generating effective social behaviors for robots , 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[18]  Brent Lance,et al.  The Rickel Gaze Model: A Window on the Mind of a Virtual Human , 2007, IVA.

[19]  S. Tipper,et al.  Gaze cueing of attention: visual attention, social cognition, and individual differences. , 2007, Psychological bulletin.

[20]  Jaap Ham,et al.  Combining Robotic Persuasive Strategies: The Persuasive Power of a Storytelling Robot that Uses Gazing and Gestures , 2015, Int. J. Soc. Robotics.

[21]  Norman I. Badler,et al.  A Review of Eye Gaze in Virtual Agents, Social Robotics and HCI: Behaviour Generation, User Interaction and Perception , 2015, Comput. Graph. Forum.

[22]  D. Ballard,et al.  Eye movements in natural behavior , 2005, Trends in Cognitive Sciences.

[23]  Justine Cassell,et al.  BEAT: the Behavior Expression Animation Toolkit , 2001, Life-like characters.

[24]  Gordon Cheng,et al.  “Mask-bot”: A life-size robot head using talking head animation for human-robot communication , 2011, 2011 11th IEEE-RAS International Conference on Humanoid Robots.

[25]  Matthew R. Walter,et al.  Understanding Natural Language Commands for Robotic Navigation and Mobile Manipulation , 2011, AAAI.

[26]  C. Koch,et al.  A saliency-based search mechanism for overt and covert shifts of visual attention , 2000, Vision Research.

[27]  Brian Scassellati,et al.  Robot nonverbal behavior improves task performance in difficult collaborations , 2016, 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[28]  C. Frith,et al.  Social interaction modifies neural response to gaze shifts. , 2007, Social cognitive and affective neuroscience.

[29]  A. Tapus,et al.  Children with Autism Social Engagement in Interaction with Nao, an Imitative Robot - A Series of Single Case Experiments , 2012 .

[30]  Gérard Bailly,et al.  Gaze, conversational agents and face-to-face communication , 2010, Speech Commun..

[31]  C. Moore,et al.  Joint attention : its origins and role in development , 1995 .

[32]  Bilge Mutlu,et al.  Designing persuasive robots: How robots might persuade people using vocal and nonverbal cues , 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[33]  Jon Driver,et al.  Adult's Eyes Trigger Shifts of Visual Attention in Human Infants , 1998 .

[34]  K.R. Thorisson,et al.  Layered modular action control for communicative humanoids , 1997, Proceedings. Computer Animation '97 (Cat. No.97TB100120).

[35]  B. Scassellati Imitation and mechanisms of joint attention: a developmental structure for building social skills on a humanoid robot , 1999 .

[36]  N. Sarkar,et al.  A Step Towards Developing Adaptive Robot-Mediated Intervention Architecture (ARIA) for Children With Autism , 2013, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[37]  Hans-Peter Seidel,et al.  Gesture generation from annotated text using an animation lexicon and gesture profiles , 2007 .

[38]  J Hyönä,et al.  Pupil Dilation as a Measure of Processing Load in Simultaneous Interpretation and Other Language Tasks , 1995, The Quarterly journal of experimental psychology. A, Human experimental psychology.

[39]  B. Scassellati,et al.  Robots for use in autism research. , 2012, Annual review of biomedical engineering.

[40]  Sven Behnke,et al.  Towards a humanoid museum guide robot that interacts with multiple persons , 2005, 5th IEEE-RAS International Conference on Humanoid Robots, 2005..

[41]  Marc Hanheide,et al.  Human-Oriented Interaction With an Anthropomorphic Robot , 2007, IEEE Transactions on Robotics.

[42]  Brian J. Scholl,et al.  The psychophysics of chasing: A case study in the perception of animacy , 2009, Cognitive Psychology.

[43]  D. Feil-Seifer,et al.  Defining socially assistive robotics , 2005, 9th International Conference on Rehabilitation Robotics, 2005. ICORR 2005..

[44]  Vasant Srinivasan High Social Acceptance of Head Gaze Loosely Synchronized with Speech for Social Robots , 2014 .

[45]  Sean Andrist,et al.  A head-eye coordination model for animating gaze shifts of virtual characters , 2012, Gaze-In '12.

[46]  Matthew W. Hoffman,et al.  A probabilistic model of gaze imitation and shared attention , 2006, Neural Networks.

[47]  Danilo De Rossi,et al.  Designing and Evaluating a Social Gaze-Control System for a Humanoid Robot , 2014, IEEE Transactions on Human-Machine Systems.

[48]  Alan Kingstone,et al.  Does gaze direction really trigger a reflexive shift of spatial attention? , 2005, Brain and Cognition.

[49]  Siddhartha S. Srinivasa,et al.  Deliberate Delays During Robot-to-Human Handovers Improve Compliance With Gaze Communication , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[50]  J. Cassell,et al.  Turn taking vs. Discourse Structure: How Best to Model Multimodal Conversation , 1998 .

[51]  Bilge Mutlu,et al.  Learning-Based Modeling of Multimodal Behaviors for Humanlike Robots , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[52]  Brian Scassellati,et al.  Modeling communicative behaviors for object references in human-robot interaction , 2016, 2016 IEEE International Conference on Robotics and Automation (ICRA).

[53]  J. Hietanen,et al.  Does facial expression affect attention orienting by gaze direction cues? , 2003, Journal of experimental psychology. Human perception and performance.

[54]  G. Butterworth,et al.  How the eyes, head and hand serve definite reference , 2000 .

[55]  L. Chelazzi,et al.  My eyes want to look where your eyes are looking: Exploring the tendency to imitate another individual's gaze , 2002, Neuroreport.

[56]  Rachid Alami,et al.  A Human-Aware Manipulation Planner , 2012, IEEE Transactions on Robotics.

[57]  Brian Scassellati,et al.  A Context-Dependent Attention System for a Social Robot , 1999, IJCAI.

[58]  Petra Wagner,et al.  Gaze Patterns in Turn-Taking , 2012, INTERSPEECH.

[59]  Jens Edlund,et al.  Taming Mona Lisa: Communicating gaze faithfully in 2D and 3D facial projections , 2012, TIIS.

[60]  S. Baron-Cohen,et al.  Gaze Perception Triggers Reflexive Visuospatial Orienting , 1999 .

[61]  Gamini Dissanayake,et al.  Nonverbal robot-group interaction using an imitated gaze cue , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[62]  Susan R. Fussell,et al.  Comparing a computer agent with a humanoid robot , 2007, 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[63]  Bilge Mutlu,et al.  Human-robot proxemics: Physical and psychological distancing in human-robot interaction , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[64]  Hideaki Kuzuoka,et al.  Prior-to-request and request behaviors within elderly day care: Implications for developing service robots for use in multiparty settings , 2007, ECSCW.

[65]  Brian Scassellati,et al.  Robot gaze does not reflexively cue human attention , 2011, CogSci.

[66]  Siddhartha S. Srinivasa,et al.  Generating Legible Motion , 2013, Robotics: Science and Systems.

[67]  Wolff‐Michael Roth Gestures: Their Role in Teaching and Learning , 2001 .

[68]  Toshikazu Hasegawa,et al.  Reflexive orienting in response to eye gaze and an arrow in children with and without autism. , 2004, Journal of child psychology and psychiatry, and allied disciplines.

[69]  Brian Scassellati,et al.  Data-Driven Model of Nonverbal Behavior for Socially Assistive Human-Robot Interactions , 2014, ICMI.

[70]  Vijay Kumar,et al.  Robotic grasping and contact: a review , 2000, Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065).

[71]  Takayuki Kanda,et al.  Friendly social robot that understands human's friendly relationships , 2004, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566).

[72]  Shaobo Huang,et al.  How to train your DragonBot: Socially assistive robots for teaching children about nutrition through play , 2014, The 23rd IEEE International Symposium on Robot and Human Interactive Communication.

[73]  Ali Borji,et al.  State-of-the-Art in Visual Attention Modeling , 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[74]  Rajesh P. N. Rao,et al.  "Social" robots are psychological agents for infants: A test of gaze following , 2010, Neural Networks.

[75]  Ross A. Knepper,et al.  Herb 2.0: Lessons Learned From Developing a Mobile Manipulator for the Home , 2012, Proceedings of the IEEE.

[76]  James C. Lester,et al.  Animated Pedagogical Agents: Face-to-Face Interaction in Interactive Learning Environments , 2000 .

[77]  Stefano Caselli,et al.  Comfortable robot to human object hand-over , 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[78]  Ning Wang,et al.  Don't just stare at me! , 2010, CHI.

[79]  A. Ito,et al.  Why robots need body for mind communication - an attempt of eye-contact between human and robot , 2004, RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No.04TH8759).

[80]  C. Breazeal,et al.  Robotic Partners' Bodies and Minds: An Embodied Approach to Fluid Human-Robot Collaboration , 2006 .

[81]  Tetsuo Ono,et al.  Android as a telecommunication medium with a human-like presence , 2007, 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[82]  Edward G. Freedman,et al.  Coordination of the eyes and head: movement kinematics , 2000, Experimental Brain Research.

[83]  Stefan Schaal,et al.  Overt visual attention for a humanoid robot , 2001, Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems. Expanding the Societal Role of Robotics in the the Next Millennium (Cat. No.01CH37180).

[84]  Brian Scassellati,et al.  No fair!! An interaction with a cheating robot , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[85]  Yuichiro Yoshikawa,et al.  The effects of responsive eye movement and blinking behavior in a communication robot , 2006, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[86]  Brett Browning,et al.  A survey of robot learning from demonstration , 2009, Robotics Auton. Syst..

[87]  Katarzyna Chawarska,et al.  Looking But Not Seeing: Atypical Visual Scanning and Recognition of Faces in 2 and 4-Year-Old Children with Autism Spectrum Disorder , 2009, Journal of autism and developmental disorders.

[88]  Allison Sauppé,et al.  Robot Deictics: How Gesture and Context Shape Referential Communication , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[89]  Cynthia Breazeal,et al.  Robotic learning companions for early language development , 2013, ICMI '13.

[90]  Matthew Stone,et al.  Speaking with hands: creating animated conversational characters from recordings of human performance , 2004, ACM Trans. Graph..

[91]  Elisabetta Bevacqua,et al.  A Model of Attention and Interest Using Gaze Behavior , 2005, IVA.

[92]  Dana Kulic,et al.  Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots , 2009, Int. J. Soc. Robotics.

[93]  R. Desimone,et al.  Neural mechanisms of selective visual attention. , 1995, Annual review of neuroscience.

[94]  Alois Knoll,et al.  The roles of haptic-ostensive referring expressions in cooperative, task-based human-robot dialogue , 2008, 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[95]  Hiroshi Ishiguro,et al.  Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction , 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[96]  Kristinn R. Thórisson,et al.  FACE-TO-FACE COMMUNICATION WITH COMPUTER AGENTS , 2000 .

[97]  S. Tipper,et al.  Sex differences in eye gaze and symbolic cueing of attention , 2005, The Quarterly journal of experimental psychology. A, Human experimental psychology.

[98]  Roel Vertegaal,et al.  Explaining effects of eye gaze on mediated group conversations: amount or synchronization? , 2002, CSCW '02.

[99]  Xia Mao,et al.  Emotional eye movement generation based on Geneva Emotion Wheel for virtual agents , 2012, J. Vis. Lang. Comput..

[100]  K. Yamazaki,et al.  Coordination of verbal and non-verbal actions in human-robot interaction at museums and exhibitions , 2010 .

[101]  Christoph Bartneck,et al.  Expressive robots in education: varying the degree of social supportive behavior of a robotic tutor , 2010, CHI.

[102]  Brian Scassellati,et al.  Shaping productive help-seeking behavior during robot-child tutoring interactions , 2016, 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[103]  S. Baron-Cohen,et al.  Is there an innate gaze module? Evidence from human neonates , 2000 .

[104]  Justine Cassell,et al.  Embodied conversational interface agents , 2000, CACM.

[105]  Riitta Parkkola,et al.  Automatic attention orienting by social and symbolic cues activates different neural networks: An fMRI study , 2006, NeuroImage.

[106]  Kristinn R. Thórisson Gandalf: an embodied humanoid capable of real-time multimodal dialogue with people , 1997, AGENTS '97.

[107]  Frank Broz,et al.  Mutual gaze, personality, and familiarity: Dual eye-tracking during conversation , 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[108]  Stefan Schaal,et al.  Robot Learning From Demonstration , 1997, ICML.

[109]  Katsushi Ikeuchi,et al.  Flexible cooperation between human and robot by interpreting human intention from gaze information , 2004, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566).

[110]  David Schlangen,et al.  The Power of a Glance: Evaluating Embodiment and Turn-Tracking Strategies of an Active Robotic Overhearer , 2015, AAAI Spring Symposia.

[111]  Brian Scassellati,et al.  How to build robots that make friends and influence people , 1999, Proceedings 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human and Environment Friendly Robots with High Intelligence and Emotional Quotients (Cat. No.99CH36289).

[112]  Tony Belpaeme,et al.  Towards retro-projected robot faces: An alternative to mechatronic and android faces , 2009, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication.

[113]  Wendy Ju,et al.  Expressing thought: Improving robot readability with animation principles , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[114]  Siddhartha S. Srinivasa,et al.  Using spatial and temporal contrast for fluent robot-human hand-overs , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[115]  Norman I. Badler,et al.  Where to Look? Automating Attending Behaviors of Virtual Human Characters , 1999, AGENTS '99.

[116]  Alan Kingstone,et al.  Brain Responses to Biological Relevance , 2008, Journal of Cognitive Neuroscience.

[117]  Matthew R. Walter,et al.  Approaching the Symbol Grounding Problem with Probabilistic Graphical Models , 2011, AI Mag..

[118]  J. Wolfe,et al.  Guided Search 2.0: A revised model of visual search , 1994, Psychonomic bulletin & review.

[119]  Catherine Pelachaud,et al.  Rules for Responsive Robots: Using Human Interactions to Build Virtual Interactions , 2002 .

[120]  C. N. Macrae,et al.  Are You Looking at Me? Eye Gaze and Person Perception , 2002, Psychological science.

[121]  A. Kingstone,et al.  Eyes are special but not for everyone: the case of autism. , 2005, Brain research. Cognitive brain research.

[122]  Jacob Cohen  A Coefficient of Agreement for Nominal Scales , 1960, Educational and Psychological Measurement.

[123]  Michael Dorr,et al.  Large-Scale Optimization of Hierarchical Features for Saliency Prediction in Natural Images , 2014, 2014 IEEE Conference on Computer Vision and Pattern Recognition.

[124]  Sonya S. Kwak,et al.  Have you ever lied?: The impacts of gaze avoidance on people's perception of a robot , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[125]  Bilge Mutlu,et al.  Modeling and Evaluating Narrative Gestures for Humanlike Robots , 2013, Robotics: Science and Systems.

[126]  Vanessa Evers,et al.  What happens when a robot favors someone? How a tour guide robot uses gaze behavior to address multiple persons while storytelling about art , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[127]  C. Kleinke Gaze and eye contact: a research review. , 1986, Psychological bulletin.

[128]  Andrea Lockerd Thomaz,et al.  Tutelage and socially guided robot learning , 2004, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566).

[129]  Tony Belpaeme,et al.  Head Pose Estimation is an Inadequate Replacement for Eye Gaze in Child-Robot Interaction , 2015, HRI.

[130]  Candace L. Sidner,et al.  Recognizing engagement in human-robot interaction , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[131]  Laurent Itti,et al.  Realistic avatar eye and head animation using a neurobiological model of visual attention , 2004, SPIE Optics + Photonics.

[132]  Siddhartha S. Srinivasa,et al.  Toward seamless human-robot handovers , 2013, Journal of Human-Robot Interaction.

[133]  Brian Scassellati,et al.  Are you looking at me? Perception of robot attention is mediated by gaze type and group size , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[134]  Brian Scassellati,et al.  Personalizing Robot Tutors to Individuals’ Learning Differences , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[135]  Robert O. Ambrose,et al.  Robonaut 2 - The first humanoid robot in space , 2011, 2011 IEEE International Conference on Robotics and Automation.

[136]  S. Brennan,et al.  Speakers' eye gaze disambiguates referring expressions early during face-to-face conversation , 2007 .

[137]  Takayuki Kanda,et al.  It's not polite to point: Generating socially-appropriate deictic behaviors towards people , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[138]  Alan Kingstone,et al.  Taking control of reflexive social attention , 2005, Cognition.

[139]  B. Scassellati,et al.  Measuring context: The gaze patterns of children with autism evaluated from the bottom-up , 2007, 2007 IEEE 6th International Conference on Development and Learning.

[140]  Nikolaos Mavridis,et al.  A review of verbal and non-verbal human-robot interactive communication , 2014, Robotics Auton. Syst..

[141]  Sean Andrist,et al.  Look Like Me: Matching Robot Personality via Gaze to Increase Motivation , 2015, CHI.

[142]  Takayuki Kanda,et al.  Footing in human-robot conversations: How robots might shape participant roles using gaze cues , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[143]  Hideaki Kuzuoka,et al.  Museum guide robot based on sociological interaction analysis , 2007, CHI.

[144]  Alexandre Bernardino,et al.  Multimodal saliency-based bottom-up attention a framework for the humanoid robot iCub , 2008, 2008 IEEE International Conference on Robotics and Automation.

[145]  Max Q.-H. Meng,et al.  Impacts of Robot Head Gaze on Robot-to-Human Handovers , 2015, Int. J. Soc. Robotics.

[146]  Takayuki Kanda,et al.  Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[147]  Zheng Li,et al.  EEMML: the emotional eye movement animation toolkit , 2011, Multimedia Tools and Applications.

[148]  Brian Scassellati Mechanisms of Shared Attention for a Humanoid Robot , 1998 .

[149]  Gernot A. Fink,et al.  Focusing computational visual attention in multi-modal human-robot interaction , 2010, ICMI-MLMI '10.

[150]  Maja J. Mataric,et al.  Embodiment and Human-Robot Interaction: A Task-Based Perspective , 2007, RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication.

[151]  Brian Scassellati,et al.  Active Learning of Joint Attention , 2006, 2006 6th IEEE-RAS International Conference on Humanoid Robots.

[152]  Barbara Tversky,et al.  Communicative Gestures Facilitate Problem Solving for Both Communicators and Recipients , 2007, Model-Based Reasoning in Science, Technology, and Medicine.

[153]  Maja J. Mataric,et al.  Investigating the effects of visual saliency on deictic gesture production by a humanoid robot , 2011, 2011 RO-MAN.

[154]  Alan Kingstone,et al.  The eyes have it!: An fMRI investigation , 2004, Brain and Cognition.

[155]  Siddhartha S. Srinivasa,et al.  Legibility and predictability of robot motion , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[156]  Bilge Mutlu,et al.  Pay attention!: designing adaptive agents that monitor and improve user engagement , 2012, CHI.

[157]  S. Drucker,et al.  The Role of Eye Gaze in Avatar Mediated Conversational Interfaces , 2000 .

[158]  Hirotake Yamazoe,et al.  Gaze-communicative behavior of stuffed-toy robot with joint attention and eye contact based on ambient gaze-tracking , 2007, ICMI '07.

[159]  Ron Artstein,et al.  Towards building a virtual counselor: modeling nonverbal behavior during intimate self-disclosure , 2012, AAMAS.

[160]  Andrew Olney,et al.  Gaze tutor: A gaze-reactive intelligent tutoring system , 2012, Int. J. Hum. Comput. Stud..

[161]  Sang Ryong Kim,et al.  Are physically embodied social agents better than disembodied social agents?: The effects of physical embodiment, tactile interaction, and people's loneliness in human-robot interaction , 2006, Int. J. Hum. Comput. Stud..

[162]  Brian Scassellati,et al.  The Benefits of Interactions with Physically Present Robots over Video-Displayed Agents , 2011, Int. J. Soc. Robotics.

[163]  Tony Belpaeme,et al.  Comparing Robot Embodiments in a Guided Discovery Learning Interaction with Children , 2015, Int. J. Soc. Robotics.

[164]  Stefan Kopp,et al.  Generation and Evaluation of Communicative Robot Gesture , 2012, Int. J. Soc. Robotics.

[165]  Adriana Tapus,et al.  Motion-Oriented Attention for a Social Gaze Robot Behavior , 2014, ICSR.

[166]  Scott E. Hudson,et al.  Spatial and Other Social Engagement Cues in a Child-Robot Interaction: Effects of a Sidekick , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[167]  Siddhartha S. Srinivasa,et al.  Human preferences for robot-human hand-over configurations , 2011, 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[168]  Christof Koch,et al.  Modeling attention to salient proto-objects , 2006, Neural Networks.

[169]  Takayuki Kanda,et al.  Pointing to space: Modeling of deictic interaction referring to regions , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[170]  Rachid Alami,et al.  Towards a Task-Aware Proactive Sociable Robot Based on Multi-state Perspective-Taking , 2013, Int. J. Soc. Robotics.

[171]  Ipke Wachsmuth,et al.  An Operational Model of Joint Attention - Timing of Gaze Patterns in Interactions between Humans and a Virtual Human , 2012, CogSci.

[172]  Gabriel Skantze,et al.  Furhat: A Back-Projected Human-Like Robot Head for Multiparty Human-Machine Interaction , 2011, COST 2102 Training School.

[173]  Minoru Asada,et al.  A constructive model for the development of joint attention , 2003, Connect. Sci..

[174]  Alan Kingstone,et al.  Attentional effects of counterpredictive gaze and arrow cues. , 2004, Journal of experimental psychology. Human perception and performance.

[175]  Anna-Lisa Vollmer,et al.  Robot feedback shapes the tutor’s presentation: How a robot’s online gaze strategies lead to micro-adaptation of the human’s conduct , 2013 .

[176]  Raymond H. Cuijpers,et al.  Turn-yielding cues in robot-human conversation , 2015 .

[177]  Takayuki Kanda,et al.  How to approach humans? Strategies for social robots to initiate interaction , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[178]  Siddhartha S. Srinivasa,et al.  Learning the communication of intent prior to physical collaboration , 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[179]  Siddhartha S. Srinivasa,et al.  CHOMP: Gradient optimization techniques for efficient motion planning , 2009, 2009 IEEE International Conference on Robotics and Automation.

[180]  Karon E. MacLean,et al.  Meet Me where I’m Gazing: How Shared Attention Gaze Affects Human-Robot Handover Timing , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[181]  Candace L. Sidner,et al.  Generating connection events for human-robot collaboration , 2011, 2011 RO-MAN.

[182]  T. Allison,et al.  Brain activation evoked by perception of gaze shifts: the influence of context , 2003, Neuropsychologia.

[183]  Henrik I. Christensen,et al.  Computational visual attention systems and their cognitive foundations: A survey , 2010, TAP.

[184]  Sean Andrist,et al.  Look together: analyzing gaze coordination with epistemic network analysis , 2015, Front. Psychol..

[185]  C. Koch,et al.  Computational modelling of visual attention , 2001, Nature Reviews Neuroscience.

[186]  S. Goldin-Meadow,et al.  The role of gesture in communication and thinking , 1999, Trends in Cognitive Sciences.

[187]  Robin R. Murphy,et al.  A survey of social gaze , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[188]  Jason Tipples,et al.  Orienting to counterpredictive gaze and arrow cues , 2008, Perception & psychophysics.

[189]  Catherine Pelachaud,et al.  Eye Communication in a Conversational 3D Synthetic Agent , 2000, AI Commun..

[190]  Norihiro Hagita,et al.  Messages embedded in gaze of interface agents: impression management with agent's gaze , 2002, CHI.

[191]  Illah R. Nourbakhsh,et al.  The role of expressiveness and attention in human-robot interaction , 2002, Proceedings 2002 IEEE International Conference on Robotics and Automation (Cat. No.02CH37292).

[192]  G. A. Miller  The magical number seven, plus or minus two: Some limits on our capacity for processing information , 1956, Psychological Review.

[193]  Marek P. Michalowski,et al.  Keepon: A Playful Robot for Research, Therapy, and Entertainment , 2009 .

[194]  Laurent Itti,et al.  Photorealistic Attention-Based Gaze Animation , 2006, 2006 IEEE International Conference on Multimedia and Expo.

[195]  A. Tapus The Grand Challenges in Helping Humans Through Social Interaction , 2007 .

[196]  C. Teuscher,et al.  Gaze following: why (not) learn it? , 2006, Developmental science.

[197]  Cynthia Breazeal,et al.  Engaging robots: easing complex human-robot teamwork using backchanneling , 2013, CSCW.

[198]  D. Povinelli  Mindblindness: An Essay on Autism and Theory of Mind (Simon Baron-Cohen, 1995) , 1996, Trends in Neurosciences.

[199]  M. V. von Grünau,et al.  The Detection of Gaze Direction: A Stare-In-The-Crowd Effect , 1995, Perception.

[200]  H. Nguyen El-E: An Assistive Robot that Fetches Objects from Flat Surfaces , 2008 .

[201]  Norman I. Badler,et al.  Visual Attention and Eye Gaze During Multiparty Conversations with Distractions , 2006, IVA.

[202]  Norman I. Badler,et al.  Evaluating perceived trust from procedurally animated gaze , 2013, MIG.

[203]  A. Meltzoff,et al.  The importance of eyes: how infants interpret adult looking behavior. , 2002, Developmental psychology.

[204]  Bilge Mutlu,et al.  A Storytelling Robot: Modeling and Evaluation of Human-like Gaze Behavior , 2006, 2006 6th IEEE-RAS International Conference on Humanoid Robots.

[205]  Charles C. Kemp,et al.  Human-Robot Interaction for Cooperative Manipulation: Handing Objects to One Another , 2007, RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication.

[206]  Eric Horvitz,et al.  Facilitating multiparty dialog with gaze, gesture, and speech , 2010, ICMI-MLMI '10.

[207]  Francisco José Madrid-Cuevas,et al.  Automatic generation and detection of highly reliable fiducial markers under occlusion , 2014, Pattern Recognit..

[208]  Brian Scassellati,et al.  Robot Gaze Is Different From Human Gaze: Evidence that robot gaze does not cue reflexive attention , 2012 .

[209]  Takanori Shibata,et al.  Living With Seal Robots—Its Sociopsychological and Physiological Influences on the Elderly at a Care House , 2007, IEEE Transactions on Robotics.

[210]  Takayuki Kanda,et al.  Providing route directions: Design of robot's utterance, gesture, and timing , 2009, 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[211]  Cynthia Breazeal,et al.  An Empirical Analysis of Team Coordination Behaviors and Action Planning With Application to Human–Robot Teaming , 2010, Hum. Factors.

[212]  Sean Andrist,et al.  Designing effective gaze mechanisms for virtual agents , 2012, CHI.

[213]  Stefan Kopp,et al.  MODELING THE PRODUCTION OF COVERBAL ICONIC GESTURES BY LEARNING BAYESIAN DECISION NETWORKS , 2010, Appl. Artif. Intell..

[214]  Brian Scassellati,et al.  Speech and Gaze Conflicts in Collaborative Human-Robot Interactions , 2014, CogSci.

[215]  M. Studdert-Kennedy Hand and Mind: What Gestures Reveal About Thought. , 1994 .

[216]  N. Emery,et al.  The eyes have it: the neuroethology, function and evolution of social gaze , 2000, Neuroscience & Biobehavioral Reviews.

[217]  Pengcheng Luo,et al.  Synchronized gesture and speech production for humanoid robots , 2010, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[218]  Anthony G. Pipe,et al.  Joint action understanding improves robot-to-human object handover , 2013, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[219]  Elizabeth A. Croft,et al.  Grip forces and load forces in handovers: Implications for designing human-robot handover controllers , 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[220]  A. Bangerter,et al.  Using Pointing and Describing to Achieve Joint Focus of Attention in Dialogue , 2004, Psychological science.

[221]  Kerstin Dautenhahn,et al.  Therapeutic and educational objectives in robot assisted play for children with autism , 2009, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication.

[222]  Alois Knoll,et al.  Interacting in time and space: Investigating human-human and human-robot joint action , 2010, 19th International Symposium in Robot and Human Interactive Communication.

[223]  Raj M. Ratwani,et al.  Integrating vision and audition within a cognitive architecture to track conversations , 2008, 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[224]  Luca Turella,et al.  When Gaze Turns into Grasp , 2006, Journal of Cognitive Neuroscience.

[225]  Mel Slater,et al.  The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment , 2003, CHI '03.

[226]  Andrea Lockerd Thomaz,et al.  Effects of nonverbal communication on efficiency and robustness in human-robot teamwork , 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[227]  Alois Knoll,et al.  Human-robot interaction in handing-over tasks , 2008, RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication.

[228]  Sean Andrist,et al.  Conversational Gaze Aversion for Virtual Agents , 2013, IVA.

[229]  P. Ravindra De Silva,et al.  Therapeutic-assisted robot for children with autism , 2009, 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[230]  R. Johansson,et al.  Eye–Hand Coordination in Object Manipulation , 2001, The Journal of Neuroscience.

[231]  Brian Scassellati,et al.  The Physical Presence of a Robot Tutor Increases Cognitive Learning Gains , 2012, CogSci.

[232]  Illah R. Nourbakhsh,et al.  A survey of socially interactive robots , 2003, Robotics Auton. Syst..

[233]  A. Treisman,et al.  A feature-integration theory of attention , 1980, Cognitive Psychology.

[234]  Zenzi M. Griffin, et al.  What the Eyes Say About Speaking, 2000, Psychological Science.

[235]  Robin R. Murphy, et al.  Inferring social gaze from conversational structure and timing, 2011, 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[236]  Yuichiro Yoshikawa, et al.  Responsive Robot Gaze to Interaction Partner, 2006, Robotics: Science and Systems.

[237]  Siddhartha S. Srinivasa, et al.  Predictability or adaptivity? Designing robot handoffs modeled from trained dogs and people, 2011, 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[238]  Matthias Scheutz, et al.  Adaptive eye gaze patterns in interactions with human and artificial agents, 2012, TIIS.

[239]  Candace L. Sidner, et al.  Where to look: a study of human-robot engagement, 2004, IUI '04.

[240]  M. Hayhoe, et al.  In what ways do eye movements contribute to everyday activities?, 2001, Vision Research.

[241]  Chrystopher L. Nehaniv, et al.  KASPAR – A Minimally Expressive Humanoid Robot for Human-Robot Interaction Research, 2009.

[242]  M. Tomasello, et al.  Joint attention and early language, 1986, Child Development.

[243]  M. Crocker, et al.  Investigating joint attention mechanisms through spoken human–robot interaction, 2011, Cognition.

[244]  Brian Scassellati, et al.  How Social Robots Will Help Us to Diagnose, Treat, and Understand Autism, 2005, ISRR.

[245]  Nadim Joni Shah, et al.  Duration matters: Dissociating neural correlates of detection and evaluation of social gaze, 2009, NeuroImage.

[246]  Chen Yu, et al.  Cooperative gazing behaviors in human multi-robot interaction, 2013.

[247]  A. Kingstone, et al.  The eyes have it! Reflexive orienting is triggered by nonpredictive gaze, 1998.

[248]  Andrea Lockerd Thomaz, et al.  Effects of responding to, initiating and ensuring joint attention in human-robot interaction, 2011, RO-MAN 2011.

[249]  T. Kanda, et al.  Robot mediated round table: Analysis of the effect of robot's gaze, 2002, Proceedings of the 11th IEEE International Workshop on Robot and Human Interactive Communication.

[250]  Rainer Stiefelhagen, et al.  Visual recognition of pointing gestures for human-robot interaction, 2007, Image Vis. Comput.

[251]  P. Downing, et al.  Why does the gaze of others direct visual attention?, 2004.