Multi-Sensory Urban Search-and-Rescue Robotics: Improving the Operator’s Omni-Directional Perception

The field of Human-Robot Interaction addresses not only problems of robots interacting with humans, but also problems of humans interacting with and controlling robots. This article focuses on the latter and evaluates multi-sensory (vision, hearing, touch, smell) feedback interfaces as a means of improving robot-operator cognition and performance. The paper summarizes three previously reported empirical studies on multi-sensory feedback using simulated robots. It also reports the results of a new study that used a physical robot to validate the findings of those earlier studies and to evaluate the merits and flaws of a multi-sensory interface as its sensory complexity was gradually increased. The human senses were selected based on their response time to feedback and on how readily their feedback mechanisms could be adapted to different types of robot-sensed data. The results show that, if well designed, multi-sensory feedback interfaces can indeed improve robot operators' data perception and performance. They shed light on the benefits and challenges that multi-sensory feedback interfaces bring, specifically to teleoperated robotics and urban search-and-rescue. The work adds to our current understanding of these kinds of interfaces and provides insights to assist continued research in the area.