First-person tele-operation of a humanoid robot

Remote control of robots is often necessary to complete complex, unstructured tasks in environments that are inaccessible or dangerous for humans. Tele-operation of humanoid robots is often performed through motion tracking, which reduces the complexity of manually controlling a large number of degrees of freedom (DOF). However, most commercial motion tracking systems are expensive and often uncomfortable. Moreover, this approach requires the operator to maintain visual contact with the robot, or a second human operator to independently maneuver a camera; as a result, even simple tasks depend heavily on the skill and synchronization of the two operators. To alleviate this problem, we propose using augmented reality to give the operator first-person vision and a natural interface for controlling the camera and, at the same time, the robot. By integrating recent off-the-shelf technologies, we provide an affordable and intuitive environment, composed of a Microsoft Kinect, an Oculus Rift, and a haptic SensorGlove, for tele-operating humanoid robots in first person. We demonstrate on the humanoid robot iCub that this set-up allows complex tasks to be accomplished quickly and naturally.
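The core of the first-person interface described above is mapping the operator's head orientation (from the HMD) onto the robot's neck joints, so the on-board camera follows the operator's gaze. A minimal sketch of that mapping is shown below; the joint names, limits, and gain are illustrative placeholders, not values from the iCub specification or the paper.

```python
# Hypothetical sketch: HMD head orientation -> clamped robot neck commands.
# Joint limits and gain below are illustrative, not real iCub parameters.

def clamp(value, lo, hi):
    """Keep a commanded angle inside the joint's mechanical limits."""
    return max(lo, min(hi, value))

# Placeholder neck joint limits in degrees.
NECK_LIMITS = {"yaw": (-45.0, 45.0), "pitch": (-30.0, 22.0)}

def head_to_neck_command(head_yaw_deg, head_pitch_deg, gain=1.0):
    """Map the operator's head yaw/pitch to safe neck joint targets.

    Clamping ensures that large or fast operator head motions never
    command the robot outside its joint range.
    """
    return {
        "yaw": clamp(gain * head_yaw_deg, *NECK_LIMITS["yaw"]),
        "pitch": clamp(gain * head_pitch_deg, *NECK_LIMITS["pitch"]),
    }
```

In a real set-up the resulting targets would be streamed to the robot's motor controllers (e.g. over YARP ports) at the tracker's update rate; the same clamp-and-retarget pattern applies to the Kinect-tracked arm joints.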
