Expressive Lights for Revealing Mobile Service Robot State

Autonomous mobile service robots move through our buildings, carrying out different tasks across multiple floors. While moving and performing their tasks, these robots find themselves in a variety of states. Although speech is often used to communicate a robot's state to humans, such communication can be ineffective. We investigate the use of lights as a persistent visualization of the robot's state with respect to both its tasks and environmental factors. Programmable lights offer a large space of choices in terms of animation pattern, color, and speed. We present this space of choices and introduce the animation profiles that we consider for animating a set of programmable lights on the robot. We conduct experiments to identify suitable animations for three representative scenarios of our autonomous symbiotic robot, CoBot. Our work enables CoBot to make its state persistently visible to humans.
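As a minimal sketch of the design space the abstract describes, the snippet below parameterizes two hypothetical animation profiles (a blink and a fade) by speed and color, and renders one frame for a strip of LEDs. The function and parameter names are illustrative assumptions, not the paper's actual implementation or CoBot's API.

```python
import math

def blink(t, period=1.0):
    # Hypothetical square-wave profile: on for the first half of each
    # period, off for the second half. Returns brightness in [0, 1].
    return 1.0 if (t % period) < period / 2 else 0.0

def fade(t, period=1.0):
    # Hypothetical sinusoidal profile: smoothly varies brightness
    # between 0 and 1 over one period.
    return 0.5 * (1.0 + math.sin(2 * math.pi * t / period))

def frame(profile, t, color=(255, 0, 0), n_leds=10, **kwargs):
    # Scale a base RGB color by the profile's brightness at time t,
    # producing one color per LED on the strip.
    brightness = profile(t, **kwargs)
    scaled = tuple(int(round(brightness * c)) for c in color)
    return [scaled] * n_leds

# frame(blink, 0.25) yields full-brightness red on every LED;
# frame(blink, 0.75) yields all LEDs off.
```

Varying the profile function, its period, and the base color spans the pattern/speed/color choices the abstract refers to; a real controller would call `frame` at a fixed rate and push each frame to the LED hardware.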
