Using nonverbal signals to request help during human-robot collaboration

Non-humanoid robots are increasingly used in collaborative tasks that rely on each collaborator's ability to effectively convey their mental state while accurately estimating and interpreting their partner's knowledge, intent, and actions. During these tasks, it may be beneficial or even necessary for the human collaborator to assist the robot. Consequently, we explore the use of nonverbal signals to request help during a collaborative task. We focus on light and sound, as they are commonly used communication channels across many domains. This paper analyzes the effectiveness of three nonverbal help signals that vary in urgency. Our results show that these signals significantly influence the human collaborator's behavior and their perception of the collaboration.
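The abstract does not specify the exact light and sound parameters used for the three help signals, so the following is only a hypothetical sketch of how urgency might be encoded in such a signal: a mapping from an assumed three-level urgency scale to blink rate, tone pitch, and inter-pulse gap, consistent with the general finding in the alarm-design literature that faster, higher, and denser signals are judged more urgent. All names and values here are illustrative assumptions, not the paper's actual design.

```python
from dataclasses import dataclass

@dataclass
class HelpSignal:
    """A nonverbal help request combining a light pattern and a tone (hypothetical)."""
    urgency: str          # assumed levels: "low", "medium", "high"
    blink_hz: float       # light blink rate; faster blinking reads as more urgent
    tone_hz: float        # tone pitch; higher pitch reads as more urgent
    pulse_gap_s: float    # silence between tone pulses; shorter gaps read as more urgent

# Illustrative parameter table (not taken from the paper): urgency is conveyed
# jointly by blink rate, pitch, and inter-pulse interval.
SIGNALS = {
    "low":    HelpSignal("low",    blink_hz=0.5, tone_hz=440.0,  pulse_gap_s=1.00),
    "medium": HelpSignal("medium", blink_hz=2.0, tone_hz=880.0,  pulse_gap_s=0.40),
    "high":   HelpSignal("high",   blink_hz=5.0, tone_hz=1760.0, pulse_gap_s=0.10),
}

def request_help(urgency: str) -> HelpSignal:
    """Return the signal the robot would emit when it needs human assistance."""
    return SIGNALS[urgency]

if __name__ == "__main__":
    sig = request_help("high")
    print(f"Blink at {sig.blink_hz} Hz, beep at {sig.tone_hz} Hz "
          f"with {sig.pulse_gap_s} s between pulses")
```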
