Vision-based navigation of unmanned aerial vehicles

This paper presents a vision-based navigation strategy for a vertical take-off and landing (VTOL) unmanned aerial vehicle (UAV) using a single embedded camera observing natural landmarks. In the proposed approach, images of the environment are first sampled, stored, and organized as a set of ordered key images (the visual path), which provides a visual memory of the environment. The navigation task is then defined over a concatenation of visual path subsets (the visual route) linking the currently observed image to a target image belonging to the visual memory. The UAV is controlled to reach each image of the visual route in turn, using a vision-based control law adapted to its dynamic model and without explicitly planning any trajectory. The framework is validated through experiments with an X4-flyer equipped with a fisheye camera.
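As a rough illustration of the visual-memory idea only (not the authors' implementation), the sketch below shows one way a visual route could be extracted from a memory stored as an ordered list of key images: localize the current view against the memory, then take the ordered slice of key images linking that location to the goal image, and feed each key image in turn to the visual servo loop. All class and function names (`KeyImage`, `VisualMemory`, `localize`, `visual_route`) and the descriptor-distance matching are hypothetical placeholders.

```python
# Minimal sketch of a visual memory as an ordered list of key images and a
# visual route as the slice of that list linking the current view to the goal.
# Image matching is stubbed with a mean-descriptor distance; the paper's actual
# feature matching, localization, and control law are not reproduced here.
from dataclasses import dataclass, field
from typing import List
import numpy as np


@dataclass
class KeyImage:
    index: int               # position along the learned visual path
    descriptors: np.ndarray  # local feature descriptors (placeholder data)


@dataclass
class VisualMemory:
    key_images: List[KeyImage] = field(default_factory=list)

    def localize(self, current_descriptors: np.ndarray) -> int:
        """Index of the key image best matching the current view
        (smallest mean-descriptor distance; purely illustrative)."""
        scores = [
            np.linalg.norm(k.descriptors.mean(axis=0)
                           - current_descriptors.mean(axis=0))
            for k in self.key_images
        ]
        return int(np.argmin(scores))

    def visual_route(self, start_idx: int, goal_idx: int) -> List[KeyImage]:
        """Ordered subset of key images linking the current view to the goal."""
        if goal_idx >= start_idx:
            return self.key_images[start_idx:goal_idx + 1]
        return list(reversed(self.key_images[goal_idx:start_idx + 1]))


# Usage sketch: localize the current view, extract the route to the goal image,
# then hand each successive key image to the vision-based controller.
memory = VisualMemory([KeyImage(i, np.random.rand(50, 32)) for i in range(10)])
current_view = np.random.rand(50, 32)
route = memory.visual_route(memory.localize(current_view), goal_idx=8)
for key_image in route:
    pass  # the control law drives the UAV toward each key image in turn
```

In the paper the transition from one key image to the next is handled by the vision-based control law itself, so no metric trajectory is planned; the sketch above only captures the route-selection bookkeeping.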
