Video and Laser Based Augmented Reality Stereoscopic Viewing for Mobile Robot Teleoperation

Abstract This paper proposes an augmented reality visualization interface that simultaneously presents visual and laser sensor information, further enhanced by stereoscopic viewing. Augmented layers are used to represent laser measurements suitably aligned with the video information. This methodology enables an operator to intuitively comprehend object proximity and to respond in an accurate and timely manner. The use of augmented reality to assist teleoperation, occasionally discussed in the literature, is here proposed following a systematic approach and developed on the basis of the authors' previous work on stereoscopic teleoperation. The approach is tested on a real telerobotic system in which a user operates a mobile robot located thousands of kilometers away. The results demonstrate the feasibility and simplicity of the proposed methodology, which represents a basis for further studies.
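The core of the overlay idea, aligning planar laser range measurements with the camera image, can be sketched as a standard pinhole projection. The following is a minimal illustration, not the authors' implementation; the intrinsic matrix `K` and the laser-to-camera extrinsics `R`, `t` are hypothetical calibration values that a real system would obtain beforehand.

```python
import numpy as np

def project_laser_to_image(ranges, angles, K, R, t):
    """Project planar laser scan points into image pixel coordinates.

    ranges, angles : laser measurements in the scanner frame (polar form)
    K              : 3x3 camera intrinsic matrix (assumed from calibration)
    R, t           : rotation (3x3) and translation (3,) from laser to camera frame
    Returns an (N, 2) array of pixel coordinates for points in front of the camera.
    """
    # Polar-to-Cartesian in the scanner frame; the scan is planar, so z = 0
    xs = ranges * np.cos(angles)
    ys = ranges * np.sin(angles)
    pts = np.stack([xs, ys, np.zeros_like(xs)], axis=1)  # (N, 3)

    # Rigid transform into the camera frame
    cam = pts @ R.T + t

    # Keep only points in front of the camera (positive depth)
    cam = cam[cam[:, 2] > 0]

    # Pinhole projection: pixel = K @ (X/Z, Y/Z, 1)
    proj = (K @ (cam / cam[:, 2:3]).T).T
    return proj[:, :2]
```

Drawing the returned pixels (e.g. as color-coded proximity markers) over each video frame yields the augmented layer; in a stereoscopic setup the projection would be repeated per eye with the corresponding camera parameters.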
