Real-time 3D SLAM for Humanoid Robot considering Pattern Generator Information

Humanoid robotics and SLAM (simultaneous localisation and mapping) are certainly two of the most significant themes of the current worldwide robotics research effort, but the two fields have until now largely run along independent parallel paths, despite the obvious benefit to be gained from joining the two. The next major step forward in humanoid robotics will be increased autonomy, and the ability of a robot to create its own world map on the fly will be a significant enabling technology. Meanwhile, SLAM techniques have found most success with robot platforms and sensor configurations which lie outside the humanoid domain: humanoid robots move with high linear and angular accelerations in full 3D, and normally only vision is available as an outward-looking sensor. Building on recently published work on monocular SLAM and on walking pattern generation, we show that real-time SLAM for a humanoid can indeed be achieved. Using the HRP-2 humanoid, we present results in which a sparse 3D map of visual landmarks is acquired on the fly with a single camera, and demonstrate loop closing and drift-free 3D motion estimation within a typical cluttered indoor environment. This is achieved by tightly coupling the pattern generator, the robot odometry and inertial sensing to aid visual mapping within a standard EKF framework. To our knowledge, this is the first implementation of real-time 3D SLAM for a humanoid robot able to demonstrate loop closing.
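
The coupling described above can be made concrete with a small sketch. The following is a minimal, illustrative EKF step in which an odometry increment taken from the walking pattern generator drives the prediction and a landmark observation supplies the correction; it is not the paper's implementation. The state is reduced to a planar pose [x, y, theta], a range-bearing measurement stands in for the image-feature observations used on HRP-2, and the function names (ekf_predict, ekf_update) and noise values are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): an EKF pose filter whose prediction
# is driven by a body-frame odometry increment supplied by the pattern
# generator, followed by a correction from a landmark observation.
import numpy as np

def ekf_predict(x, P, odom, Q):
    """Propagate pose mean x = [x, y, theta] and covariance P with a
    pattern-generator odometry increment odom = [dx, dy, dtheta]."""
    dx, dy, dth = odom
    c, s = np.cos(x[2]), np.sin(x[2])
    # Pose composition: rotate the body-frame increment into the world frame.
    x_new = np.array([x[0] + c * dx - s * dy,
                      x[1] + s * dx + c * dy,
                      x[2] + dth])
    # Jacobian of the motion model with respect to the previous pose.
    F = np.array([[1.0, 0.0, -s * dx - c * dy],
                  [0.0, 1.0,  c * dx - s * dy],
                  [0.0, 0.0,  1.0]])
    # Q covers slip and flexibility not captured by the planned gait.
    P_new = F @ P @ F.T + Q
    return x_new, P_new

def ekf_update(x, P, z, landmark, R):
    """Correct the pose with a range-bearing observation z of a known landmark."""
    dx_l, dy_l = landmark[0] - x[0], landmark[1] - x[1]
    q = dx_l**2 + dy_l**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy_l, dx_l) - x[2]])
    H = np.array([[-dx_l / np.sqrt(q), -dy_l / np.sqrt(q),  0.0],
                  [ dy_l / q,          -dx_l / q,          -1.0]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    innov = z - z_hat
    innov[1] = (innov[1] + np.pi) % (2 * np.pi) - np.pi  # wrap the bearing error
    return x + K @ innov, (np.eye(3) - K @ H) @ P

if __name__ == "__main__":
    x = np.zeros(3)
    P = np.eye(3) * 1e-4
    Q = np.diag([1e-4, 1e-4, 1e-5])   # per-step process noise (assumed values)
    R = np.diag([1e-3, 1e-4])         # measurement noise (assumed values)
    x, P = ekf_predict(x, P, odom=[0.05, 0.0, 0.01], Q=Q)
    x, P = ekf_update(x, P, z=np.array([2.0, 0.3]), landmark=[1.8, 0.6], R=R)
    print(x)
```

In the actual system the prediction would come from the preview-control pattern generator's commanded motion together with odometry and inertial sensing, with the process noise absorbing the discrepancy between the planned and executed walk.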
