Towards Autonomous Fixed-Wing Unmanned Aerial Vehicle Landing: A Vision-Aided Inertial Navigation under Sensor Reconfiguration Scenario

Abstract Autonomous landing of unmanned aerial vehicles (UAVs) requires accurate position estimation, yet the standard inertial navigation unit (INU), i.e., an inertial measurement unit combined with a global positioning system (GPS) receiver, provides relatively poor altitude accuracy. A common remedy is to aid the INU with additional sensors and/or ground infrastructure, but the limited payload of UAVs and the extra cost involved are the main hurdles to this approach. Dynamic sensor reconfiguration offers an attractive alternative: a new sensor system is constructed from sensors already available in the environment, without adding sensory equipment to the UAV. This paper considers a sensor reconfiguration scenario for autonomous fixed-wing UAV landing and investigates the resulting vision-aided inertial navigation system. It presents (i) a sensor fusion algorithm, based on the Extended Kalman Filter (EKF), for a passive monocular camera and an INU, and (ii) an object-detection vision algorithm using optical flow. The EKF is chosen to handle the nonlinearities of the vision system, and optical flow is used to detect the UAV robustly against a noisy background. Pilot-controlled landing experiments on a NASA UAV platform, together with filter simulations, were performed to validate the feasibility of the proposed approach; the results show a 50-80% reduction in altitude estimation error.
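To illustrate the kind of fusion step the abstract describes, the sketch below shows a single EKF measurement update in which a vision-derived range measurement corrects an INU altitude prior. This is a minimal sketch under simplifying assumptions, not the paper's actual filter: the state is reduced to a scalar altitude, and the nonlinear measurement is a slant range to a marker at a hypothetical known horizontal offset `d`, standing in for the full camera projection model.

```python
import numpy as np

def ekf_update(x, P, z, h, H_jac, R):
    """Single EKF measurement update.
    x, P   : prior state estimate and covariance
    z      : measurement vector
    h      : nonlinear measurement function h(x)
    H_jac  : Jacobian of h evaluated at x
    R      : measurement noise covariance
    """
    H = H_jac(x)
    y = z - h(x)                        # innovation (residual)
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Illustrative setup: state = [altitude]; the camera measures slant
# range to a ground marker at horizontal offset d (hypothetical values).
d = 30.0
h = lambda x: np.array([np.sqrt(x[0]**2 + d**2)])
H_jac = lambda x: np.array([[x[0] / np.sqrt(x[0]**2 + d**2)]])

x = np.array([100.0])                    # INU altitude prior (m)
P = np.array([[25.0]])                   # prior variance (5 m std dev)
z = np.array([np.sqrt(95.0**2 + d**2)])  # range consistent with 95 m altitude
R = np.array([[1.0]])                    # vision range variance

x, P = ekf_update(x, P, z, h, H_jac, R)
# The accurate vision measurement pulls the altitude estimate toward
# 95 m and shrinks the uncertainty well below the INU-only prior.
```

Linearizing the measurement through its Jacobian at the current estimate is exactly what distinguishes the EKF from the linear Kalman filter, and it is why the EKF suits the nonlinear camera geometry mentioned in the abstract.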
