Humanoid navigation planning using future perceptive capability

We present an approach to navigation planning for humanoid robots that aims to ensure reliable execution by augmenting the planning process to reason about the robot's ability to successfully perceive its environment during operation. By efficiently simulating the robot's perception system during search, our planner generates a metric, the perceptive capability, that quantifies the "sensability" of the environment in each state given the task to be accomplished. We apply our method to the problem of planning robust autonomous walking sequences for an HRP-2 humanoid. A fast GPU-accelerated 3D tracker provides perception, and a footstep planner incorporates reasoning about the robot's perceptive capability. Combined with a controller that adaptively adjusts the height of swing-leg trajectories, HRP-2 is able to navigate around obstacles and climb stairs in dynamically changing environments. Reasoning about future perceptive capability ensures that sensing remains operational throughout the walking sequence and yields higher task success rates than perception-unaware planning.
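The core idea, folding a simulated-perception score into the planner's search cost, can be illustrated with a minimal sketch. Everything here is an assumption for illustration: a simplified 2D sensor model standing in for the GPU-accelerated 3D tracker, point landmarks standing in for trackable scene structure, and an A*-style search over a small hypothetical footstep action set; the names `perceptive_capability`, `plan`, and `w_perception` are not from the paper.

```python
# Illustrative sketch only: a planner that penalizes states with poor
# simulated perceptive capability, so sensing stays operational en route.
import heapq
import math

def perceptive_capability(state, landmarks, fov=math.pi / 2, max_range=3.0):
    """Fraction of landmarks the simulated sensor can see from `state`.

    `state` is (x, y, heading); a landmark counts as visible when it lies
    within the sensor's range and field of view. This 2D visibility check
    is a stand-in for simulating a real 3D tracker during search.
    """
    x, y, th = state
    visible = 0
    for lx, ly in landmarks:
        dx, dy = lx - x, ly - y
        if math.hypot(dx, dy) > max_range:
            continue
        bearing = math.atan2(dy, dx) - th
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap
        if abs(bearing) <= fov / 2:
            visible += 1
    return visible / len(landmarks) if landmarks else 0.0

def plan(start, goal, steps, landmarks, w_perception=2.0):
    """A*-style search over discrete footstep actions.

    Step cost = step length + w_perception * (1 - perceptive capability),
    so footholds from which the environment remains "sensable" are cheap
    and perception-blind shortcuts are penalized.
    """
    def h(s):  # straight-line distance to goal (heuristic)
        return math.hypot(goal[0] - s[0], goal[1] - s[1])

    frontier = [(h(start), 0.0, start, [start])]
    seen = set()
    while frontier:
        _, g, s, path = heapq.heappop(frontier)
        key = (round(s[0], 2), round(s[1], 2), round(s[2], 2))
        if key in seen:
            continue
        seen.add(key)
        if h(s) < 0.3:  # close enough to goal
            return path
        for dx, dy, dth in steps:  # apply each footstep in the body frame
            c, si = math.cos(s[2]), math.sin(s[2])
            nxt = (s[0] + c * dx - si * dy,
                   s[1] + si * dx + c * dy,
                   s[2] + dth)
            pen = w_perception * (1.0 - perceptive_capability(nxt, landmarks))
            ng = g + math.hypot(dx, dy) + pen
            heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None  # no plan keeps sensing and reaches the goal
```

Raising `w_perception` trades path length for perceptive capability, which is the essential tension the paper's planner reasons about; setting it to zero recovers an ordinary perception-unaware footstep search.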
