Terminal phase visual position estimation for a tail-sitting vertical takeoff and landing UAV via a Kalman filter

Computer vision has been an active field of research for many decades and has become widely used in airborne applications over the last decade or two. Much airborne computer vision research has focused on navigation for Unmanned Air Vehicles (UAVs). This paper presents a method to estimate the full 3D position of a UAV by integrating visual cues from a single image with data from an Inertial Measurement Unit (IMU) under a Kalman filter formulation. Previous work on visual 3D position estimation for UAV landing has relied on two or more image frames containing feature-rich content; however, raw vision state estimates are highly susceptible to image noise. This paper uses a fairly conventional landing pad, with visual features extracted for use in the Kalman filter, to obtain optimal 3D position estimates. The methodology promises state estimates better suited to the guidance and control of a UAV, and enables autonomous landing to be conducted without GPS information. Results from tests with flight images are presented.
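The fusion scheme described above can be illustrated with a minimal sketch: a linear Kalman filter that propagates the state with IMU acceleration measurements and corrects it with a vision-derived position fix. This is a simplified one-axis illustration under assumed noise values and a position/velocity state model, not the paper's actual filter design.

```python
import numpy as np

# One-axis position/velocity Kalman filter sketch: predict with an IMU
# acceleration input, correct with a vision-derived position measurement.
# All noise parameters below are illustrative assumptions.

dt = 0.02                                  # IMU sample period (s), assumed
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])        # control input matrix (acceleration)
H = np.array([[1.0, 0.0]])                 # vision measures position only
Q = np.diag([1e-4, 1e-3])                  # process noise covariance (assumed)
R = np.array([[0.05]])                     # vision measurement noise (assumed)

x = np.zeros((2, 1))                       # initial state [position; velocity]
P = np.eye(2)                              # initial state covariance

def predict(x, P, accel):
    """Propagate the state estimate using the IMU acceleration measurement."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the state estimate with a vision position measurement z."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy run: constant 1 m/s^2 acceleration; vision reports the true position.
for k in range(1, 51):
    x, P = predict(x, P, accel=1.0)
    true_pos = 0.5 * 1.0 * (k * dt) ** 2
    x, P = update(x, P, z=np.array([[true_pos]]))
```

The paper's actual filter estimates full 3D position and fuses features from a single landing-pad image, but the predict/update structure shown here is the core of any such vision/IMU Kalman formulation.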
