Robot Nonverbal Communication as an AI Problem (and Solution)

In typical human interactions, nonverbal behaviors such as eye gaze and gesture augment and reinforce spoken communication. To give robots similar nonverbal behaviors in human-robot interaction (HRI), researchers can apply artificial intelligence (AI) techniques such as machine learning, cognitive modeling, and computer vision. Knowledge of nonverbal behavior can also benefit AI itself: because nonverbal communication reveals human mental states, these behaviors provide additional input to AI problems such as learning from demonstration, natural language processing, and motion planning. This article describes how nonverbal communication in HRI can benefit from AI techniques, and how AI problems can in turn use nonverbal communication in their solutions.
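To make the second direction concrete, the sketch below shows one way a demonstrator's gaze could supplement a learning-from-demonstration pipeline. It is a minimal illustration in plain Python, assuming a vision front end has already labeled each frame with the fixated object; the Frame and infer_goal_object names are hypothetical, not drawn from the article or any particular system.

```python
# Minimal sketch (illustrative, not from the article): using a demonstrator's
# fixations to infer which object a recorded reach is actually about.
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Frame:
    """One time step of a recorded demonstration."""
    gripper_xy: Tuple[float, float]  # demonstrator's hand position
    gaze_target: Optional[str]       # fixated object, if any, as labeled by
                                     # a hypothetical vision front end

def infer_goal_object(frames: List[Frame]) -> str:
    """Return the most-fixated object as a proxy for the task goal.

    Gaze tends to lead the hand, so fixation counts carry intent
    information that the motion trajectory alone does not.
    """
    fixations = Counter(f.gaze_target for f in frames if f.gaze_target)
    target, _count = fixations.most_common(1)[0]
    return target

demo = [
    Frame((0.0, 0.0), "mug"),
    Frame((0.1, 0.2), "mug"),
    Frame((0.2, 0.4), None),      # blink/saccade: no gaze label
    Frame((0.3, 0.6), "mug"),
    Frame((0.4, 0.8), "bottle"),  # brief glance at a distractor
]

print(infer_goal_object(demo))   # -> mug
```

In a real system the gaze labels would come from an eye tracker or head-pose estimator, and fixation statistics would feed a richer intent model; the point here is only that the nonverbal channel adds information the demonstrated trajectory alone lacks.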
