Robust Autonomous Navigation in a Factory Environment

This paper describes the development and demonstration of a navigation system for a robot traversing a factory-like environment. The system combines several sensors to increase robustness and maximize the likelihood of successfully executing navigation commands, using motion capture, computer vision, and odometry to form a robust estimate of robot motion. Performance degrades gracefully as individual sensors become unreliable, and the robot is able to continue safe operation even after multiple sensor failures. The system was deployed on a ground robot commanded to navigate a course configured to represent a worst-case factory environment, and was demonstrated at Boeing's Collaborative Autonomous Systems Lab in Seattle, Washington.
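To make the graceful-degradation idea concrete, the sketch below shows one simple way such multi-sensor fusion could be structured: each pose source (motion capture, vision, odometry) reports an estimate with an assumed variance, available readings are combined by inverse-variance weighting, and any sensor that drops out is simply excluded. This is a minimal illustrative sketch, not the paper's implementation; the sensor names, variances, and `fuse` helper are assumptions for the example.

```python
"""Illustrative sketch (not the paper's method): fuse pose estimates from
several sensors by inverse-variance weighting, skipping any sensor that is
unavailable so the estimate degrades gracefully as sensors fail."""
from dataclasses import dataclass
from typing import Dict, Optional

import numpy as np


@dataclass
class SensorReading:
    position: Optional[np.ndarray]  # 2D position estimate, or None if the sensor is down
    variance: float                 # assumed measurement variance (m^2), illustrative values


def fuse(readings: Dict[str, SensorReading]) -> Optional[np.ndarray]:
    """Return the inverse-variance weighted mean of all available readings,
    or None if every sensor has dropped out."""
    weights, positions = [], []
    for reading in readings.values():
        if reading.position is None:
            continue  # sensor unavailable: exclude it instead of failing
        weights.append(1.0 / reading.variance)
        positions.append(reading.position)
    if not positions:
        return None  # no sensor left: caller should bring the robot to a safe stop
    w = np.asarray(weights)
    p = np.vstack(positions)
    return (w[:, None] * p).sum(axis=0) / w.sum()


if __name__ == "__main__":
    # Motion capture is assumed precise; vision and odometry are noisier.
    readings = {
        "mocap":    SensorReading(np.array([1.02, 2.01]), variance=1e-4),
        "vision":   SensorReading(np.array([1.05, 1.95]), variance=1e-2),
        "odometry": SensorReading(np.array([0.90, 2.10]), variance=1e-1),
    }
    print("all sensors available:", fuse(readings))

    # Simulate a motion-capture dropout: the estimate degrades but remains usable.
    readings["mocap"] = SensorReading(None, variance=1e-4)
    print("motion capture lost:  ", fuse(readings))
```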
