Robot Communication Via Motion: Closing the Underwater Human-Robot Interaction Loop

In this paper, we propose a novel method for underwater robot-to-human communication that uses the motion of the robot as "body language". To evaluate this system, we develop simulated examples of the system's body-language gestures, called kinemes, and compare them in a user study against a baseline system that uses flashing colored lights. Our results provide evidence that motion can serve as a successful communication vector: one that is accurate, easy to learn, and fast enough for practical use, all without adding any hardware to our platform. We thus contribute to "closing the loop" for human-robot interaction underwater by proposing and testing this system, suggesting a library of possible body-language gestures for underwater robots, and offering insight into the design of nonverbal robot-to-human communication methods.