Comparing Virtual Reality Interfaces for the Teleoperation of Robots

Whether exploring a defunct nuclear reactor, defusing a bomb, delivering medicine to quarantined patients, repairing the International Space Station from the outside, or providing dexterous manipulation for people with motor impairments, robots can go where humans cannot, augment human capabilities, and improve quality of life and work. Because even the most advanced robots struggle with tasks that require grasping and manipulation, human teleoperation is often a practical alternative for such tasks. By drawing on the dexterity, expertise, and background knowledge of a human operator, robots can leverage the skills of their human teammates without requiring those humans to be physically present. However, existing robot teleoperation interfaces often rely on 2D methods for viewing and interacting with the 3D world, which is cumbersome for operators. Virtual reality interfaces may resolve several problems with traditional teleoperation interfaces, such as perspective adjustment and action specification. The goal of this research was to investigate the efficacy of two virtual reality interfaces for remotely controlling a Baxter robot across a variety of dexterous manipulation tasks: positional control, similar to waypoint navigation, and trajectory control, similar to click and drag. The results of this study will help us develop control interfaces that allow for more intuitive robot manipulation and, ultimately, better distal collaboration between humans and robots.
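
To illustrate the distinction between the two interfaces, the sketch below contrasts a positional (waypoint) command with a streamed trajectory command for a ROS-controlled arm. This is a minimal sketch under assumed conventions, not the study's actual implementation: the topic name /teleop/goal_pose and the get_vr_controller_pose() helper are hypothetical placeholders.

```python
#!/usr/bin/env python
# Minimal sketch (not the study's implementation) contrasting the two VR
# teleoperation modes. The topic name and get_vr_controller_pose() helper
# are hypothetical placeholders.
import rospy
from geometry_msgs.msg import PoseStamped


def get_vr_controller_pose():
    """Hypothetical stand-in for the tracked VR controller pose; a real
    system would read this from the headset's tracking data."""
    pose = PoseStamped()
    pose.header.frame_id = "base"
    pose.pose.position.x = 0.6
    pose.pose.position.z = 0.3
    pose.pose.orientation.w = 1.0
    return pose


def positional_control(pub):
    """Waypoint-style control: publish a single goal pose when the user
    confirms a target, then let the motion planner move the arm to it."""
    goal = get_vr_controller_pose()
    goal.header.stamp = rospy.Time.now()
    pub.publish(goal)  # one discrete goal per user action


def trajectory_control(pub, rate_hz=30):
    """Click-and-drag style control: continuously stream the controller
    pose so the end effector follows the user's hand in real time."""
    rate = rospy.Rate(rate_hz)
    while not rospy.is_shutdown():
        pose = get_vr_controller_pose()
        pose.header.stamp = rospy.Time.now()
        pub.publish(pose)  # many small updates trace out a trajectory
        rate.sleep()


if __name__ == "__main__":
    rospy.init_node("vr_teleop_sketch")
    # Hypothetical topic consumed by a downstream IK / motion-planning node.
    goal_pub = rospy.Publisher("/teleop/goal_pose", PoseStamped, queue_size=1)
    positional_control(goal_pub)    # waypoint mode
    # trajectory_control(goal_pub)  # streaming (click-and-drag) mode
```

The key difference is the publishing pattern: positional control sends sparse, discrete goals and delegates the motion between them to a planner, while trajectory control streams dense pose updates so the operator directly shapes the end effector's path.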
