Improving monocular SLAM with altimeter hints for fixed-wing aircraft navigation and emergency landing

This paper presents a visual Simultaneous Localization and Mapping (SLAM) method for bridging temporary satellite navigation dropouts on an unpowered fixed-wing aircraft. It is designed for flight altitudes beyond typical stereo ranges but within the range of distance measurement sensors. The proposed visual SLAM method combines a conventional localization step based on monocular camera resectioning with a mapping step that incorporates radar altimeter data for absolute scale estimation, so neither the map nor the estimated flight path suffers from scale drift. The method does not require simplifying assumptions such as known landmarks and is therefore suitable for unknown and nearly arbitrary terrain. It is evaluated on sensor datasets recorded from a manned Cessna 172 aircraft. With a 5% absolute scale error in the radar measurements, causing approximately 2-6% accumulated error over the flown distance, stable positioning is achieved over several minutes of flight time. The main limitations are flight altitudes above the 750 m radar range, where the monocular method suffers from scale drift, and, depending on flight speed, flights below 50 m, where image processing with a downward-looking camera becomes difficult due to high optical flow rates and low image overlap.
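The core idea of using the radar altimeter to fix the absolute scale of a monocular reconstruction can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a nadir-looking camera over locally planar terrain, so that the radar height corresponds to the typical depth of triangulated ground points below the aircraft; all function names are hypothetical.

```python
import numpy as np

def altimeter_scale(points_cam: np.ndarray, h_radar: float) -> float:
    """Estimate the absolute scale of a monocular reconstruction.

    points_cam : (N, 3) triangulated map points in the camera frame, in the
                 arbitrary monocular scale (z-axis = optical axis, pointing
                 down toward the terrain for a nadir-looking camera).
    h_radar    : radar altimeter height above ground in metres.

    Assumes the terrain patch below the aircraft is roughly planar, so the
    median point depth corresponds to the radar height. The median makes the
    estimate robust against triangulation outliers.
    """
    depths = points_cam[:, 2]
    median_depth = np.median(depths[depths > 0])
    return h_radar / median_depth

def rescale(points_cam: np.ndarray, trajectory: np.ndarray, s: float):
    """Apply the absolute scale s to map points and camera trajectory."""
    return points_cam * s, trajectory * s

# Example: points triangulated at ~1 depth unit, radar reads 300 m,
# so the monocular unit corresponds to roughly 300 m.
pts = np.array([[0.0, 0.0, 1.00],
                [0.1, 0.0, 1.02],
                [-0.1, 0.05, 0.98]])
s = altimeter_scale(pts, h_radar=300.0)
```

Because the scale is re-estimated whenever a radar measurement is available, a fixed relative error in the altimeter (e.g. 5%) translates into a bounded relative error in the map scale rather than an unbounded drift, which matches the error behaviour described above.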
