Mixed reality for robotics

Mixed Reality can be a valuable tool for research and development in robotics. In this work, we refine the definition of Mixed Reality to accommodate seamless interaction between physical and virtual objects in any number of physical or virtual environments. In particular, we show that Mixed Reality can reduce the gap between simulation and implementation by enabling the prototyping of algorithms on a combination of physical and virtual objects, including robots, sensors, and humans. Robots can be enhanced with additional virtual capabilities, or can interact with humans without sharing physical space. We demonstrate Mixed Reality with three representative experiments, each of which highlights the advantages of our approach. We also provide a testbed for Mixed Reality with three different virtual robotics environments in combination with the Crazyflie 2.0 quadcopter.
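To make the idea of seamless physical-virtual interaction concrete, here is a minimal sketch (not taken from the paper; all names and parameters are illustrative) of one mixed-reality control step: a physical robot's tracked position is combined with a purely virtual obstacle, and the controller treats both uniformly, commanding the real robot to avoid an object that exists only in simulation.

```python
import math

def avoid_virtual_obstacle(robot_pos, obstacle_pos, safe_radius=1.0, gain=0.5):
    """Return a 2D velocity command pushing the robot away from a virtual
    obstacle when inside the safety radius, else (0.0, 0.0).

    robot_pos    -- (x, y) of the physical robot, e.g. from motion capture
    obstacle_pos -- (x, y) of a virtual obstacle that has no physical counterpart
    """
    dx = robot_pos[0] - obstacle_pos[0]
    dy = robot_pos[1] - obstacle_pos[1]
    dist = math.hypot(dx, dy)
    if dist >= safe_radius or dist == 0.0:
        return (0.0, 0.0)
    # Repulsion grows as the robot approaches the virtual obstacle.
    scale = gain * (safe_radius - dist) / dist
    return (scale * dx, scale * dy)

# Example: physical robot at (0.2, 0.0), virtual obstacle at the origin;
# the resulting command pushes the robot away along +x.
vx, vy = avoid_virtual_obstacle((0.2, 0.0), (0.0, 0.0))
```

The point of the sketch is that the controller is agnostic to whether the obstacle is physical or virtual, which is what lets the same algorithm move from pure simulation to a mixed setup without modification.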
