Multisensory Wearable Interface for Immersion and Telepresence in Robotics

The idea of being present in a remote location has inspired researchers to develop robotic devices that allow humans to experience the feeling of telepresence. These devices require multiple channels of sensory feedback to provide a realistic telepresence experience. In this paper, we develop a wearable interface for immersion and telepresence that gives humans the capability both to receive multisensory feedback from vision, touch, and audio and to remotely control a robot platform. Multimodal feedback from the remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved through a modularised architecture, which allows the user to visually explore the remote environment. We validated our approach with multiple experiments in which participants, located at different venues, successfully controlled the robot platform while visually exploring, touching, and listening to a remote environment. In our experiments, we used two different robotic platforms: 1) the iCub humanoid robot and 2) the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use, and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
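
A modularised architecture of this kind would typically decouple each feedback channel (vision, touch, audio) from the channel that carries control commands back to the robot. The sketch below is a minimal, hypothetical Python illustration of such a design, assuming one polling module per sensory channel and a single control loop driven by the wearable headset pose; the module names, message structure, and stub sensors are assumptions for illustration and do not reflect the authors' actual implementation on the iCub or Pioneer LX.

```python
# Hypothetical sketch of a modular telepresence loop (illustration only).
# Each feedback module polls one sensory channel of the remote robot; the
# control loop maps the wearable interface pose to robot commands and
# renders incoming feedback samples to the user.

import queue
import threading
import time
from dataclasses import dataclass


@dataclass
class FeedbackSample:
    channel: str      # "vision", "touch", or "audio"
    payload: object   # raw data from the corresponding robot sensor
    timestamp: float


class FeedbackModule(threading.Thread):
    """Polls one sensory channel and pushes samples to the wearable interface."""

    def __init__(self, channel, read_sensor, out_queue, period=0.05):
        super().__init__(daemon=True)
        self.channel = channel
        self.read_sensor = read_sensor   # callable returning the latest sensor data
        self.out_queue = out_queue
        self.period = period
        self.running = True

    def run(self):
        while self.running:
            sample = FeedbackSample(self.channel, self.read_sensor(), time.time())
            self.out_queue.put(sample)
            time.sleep(self.period)


def control_loop(get_head_pose, send_robot_command, feedback_queue, render, duration=5.0):
    """Maps the headset pose to robot commands and renders multisensory feedback."""
    t_end = time.time() + duration
    while time.time() < t_end:
        send_robot_command(get_head_pose())   # e.g. head pan/tilt or base velocity
        try:
            sample = feedback_queue.get(timeout=0.05)
            render(sample)                    # display frame / drive vibrotactile cue / play audio
        except queue.Empty:
            pass


if __name__ == "__main__":
    # Stub sensors and actuators stand in for the real robot platform.
    fb_queue = queue.Queue()
    modules = [
        FeedbackModule("vision", lambda: "camera_frame", fb_queue),
        FeedbackModule("touch", lambda: [0.1, 0.0, 0.3], fb_queue),
        FeedbackModule("audio", lambda: "audio_chunk", fb_queue),
    ]
    for m in modules:
        m.start()

    control_loop(
        get_head_pose=lambda: {"pan": 0.0, "tilt": 0.0},
        send_robot_command=lambda cmd: None,
        feedback_queue=fb_queue,
        render=lambda s: print(f"{s.channel}: {s.payload}"),
        duration=1.0,
    )
```

Keeping each sensory channel in its own module is what makes the interface adaptable across platforms: swapping the iCub for the Pioneer LX would only require replacing the per-channel sensor readers and the command sender, not the control loop itself.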
