Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks

Although gesture-based input and augmented reality (AR) facilitate intuitive human-robot interaction (HRI), prior implementations have relied on research-grade hardware and software. This paper explores using tablets to render mixed-reality visual environments that support human-robot collaboration for object manipulation. A mobile interface is created on a tablet by integrating real-time vision, 3D graphics, touchscreen interaction, and wireless communication. This mobile interface augments live video of physical objects in a robot's workspace with corresponding virtual objects, which a user can manipulate to intuitively command the robot to manipulate the physical objects. By generating the mixed-reality environment on an exocentric view provided by the tablet camera, the interface establishes a common frame of reference through which the user and the robot can effectively communicate spatial information for object manipulation. After addressing challenges arising from the limited sensing and computational resources of mobile devices, the interface is evaluated in a user study to examine the performance and user experience of the proposed approach.
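A core step in such an interface is converting a user's tap on the augmented video into a position in the robot's workspace. As a rough illustration only (not the paper's implementation), the sketch below back-projects a touch pixel through a pinhole camera model and intersects the resulting view ray with the workspace plane, assuming the tablet camera's pose relative to the workspace is already known (e.g., recovered from a fiducial marker). All names and parameters here are hypothetical.

```python
# Illustrative sketch, assuming a pinhole camera model and a known
# camera pose relative to the workspace plane z = 0. Not the paper's
# actual pipeline; parameter names are invented for this example.

def mat_vec(R, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def tap_to_workspace(u, v, fx, fy, cx, cy, R, cam_pos):
    """Map a touchscreen pixel (u, v) to (x, y) on the workspace plane.

    fx, fy, cx, cy: camera intrinsics (focal lengths, principal point).
    R: 3x3 rotation taking camera-frame directions to workspace frame.
    cam_pos: camera center expressed in workspace coordinates.
    """
    # View ray direction in the camera frame (unit depth).
    d_cam = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate the ray into the workspace frame.
    d_w = mat_vec(R, d_cam)
    if abs(d_w[2]) < 1e-9:
        raise ValueError("view ray is parallel to the workspace plane")
    # Intersect cam_pos + s * d_w with the plane z = 0.
    s = -cam_pos[2] / d_w[2]
    if s <= 0:
        raise ValueError("workspace plane lies behind the camera")
    return (cam_pos[0] + s * d_w[0], cam_pos[1] + s * d_w[1])
```

With the camera 1 m above the workspace looking straight down, a tap at the principal point maps to the point directly beneath the camera; taps away from the principal point map proportionally farther out on the plane.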
