Passive Demonstrations of Light-Based Robot Signals for Improved Human Interpretability

When mobile robots navigate crowded, human-populated environments, intersecting trajectories create the potential for conflict. This study investigates light-emitting diodes (LEDs) arranged along a robot's chassis, in the manner of a car's turn signal, as a non-anthropomorphic yet familiar signal for conveying the intended path of a mobile service robot. We study the scenario of a human and a robot heading directly toward each other in a hallway, which can give rise to the familiar experience in which both parties step to the right, then the left, then the right again, repeatedly blocking each other's paths until they manage to coordinate their movements and pass. A pilot study revealed that people do not always interpret this signal as one might expect, i.e., by analogy to a car's turn signal. This motivated a 2 × 2 experiment in which the robot either does or does not use the LEDs to indicate its intended direction of travel, and in which participants either can or cannot witness the robot's “lane-changing” behavior farther down the hallway before coming into direct proximity with the robot. The results demonstrate that exposing participants to the robot's use of the LED signal just once before the hallway encounter is sufficient to disambiguate its meaning, and thus greatly enhances its utility in situ, with no direct instruction or training. These findings suggest a paradigm of passive demonstration for such signals in future applications.
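
To make the turn-signal analogy concrete, the sketch below shows one plausible way such a signal could be driven: the chassis LEDs on the side of the intended lane change blink at a fixed rate while the maneuver is pending, as on a car. This is a minimal illustrative sketch, not the authors' implementation; the names `TurnSignal`, `set_leds`, and `blink_hz` are assumptions introduced here for clarity.

```python
import time


class TurnSignal:
    """Car-style turn signal for a mobile robot's chassis LEDs.

    Hypothetical sketch: blinks the LED strip on the side of the
    intended lane change, mirroring an automotive turn indicator.
    """

    def __init__(self, set_leds, blink_hz=1.5):
        # set_leds(side, on) is an assumed hardware callback, e.g. one
        # that switches the left or right strip of chassis LEDs.
        self.set_leds = set_leds
        self.period = 1.0 / blink_hz

    def signal(self, direction, duration_s):
        """Blink the LEDs on `direction` ('left' or 'right') for duration_s."""
        assert direction in ("left", "right")
        end = time.monotonic() + duration_s
        on = False
        while time.monotonic() < end:
            on = not on
            self.set_leds(direction, on)
            time.sleep(self.period / 2.0)  # half period on, half period off
        self.set_leds(direction, False)    # leave the strip dark when done


if __name__ == "__main__":
    # Demo with a stub callback that prints instead of driving hardware.
    turn = TurnSignal(lambda side, on: print(f"{side} LEDs {'ON' if on else 'off'}"))
    turn.signal("right", duration_s=3.0)  # robot is about to shift right
```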
