Real-Time Onboard 3D State Estimation of an Unmanned Aerial Vehicle in Multi-Environments Using Multi-Sensor Data Fusion

Estimating the state of an unmanned aerial vehicle (UAV) in real time across multiple environments remains a challenge. Although the global navigation satellite system (GNSS) is widely used, a drone cannot estimate its position when the GNSS signal is unavailable or disturbed. In this paper, the multi-environment state estimation problem is addressed by employing an extended Kalman filter (EKF) to fuse data from multiple heterogeneous sensors (MHS): an inertial measurement unit (IMU), a magnetometer, a barometer, a GNSS receiver, an optical flow sensor (OFS), Light Detection and Ranging (LiDAR), and an RGB-D camera. Finally, the robustness and effectiveness of the EKF-based multi-sensor data fusion system are verified through field flights in unstructured, indoor, outdoor, and indoor-outdoor transition scenarios.
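The paper's full 3D filter and its noise parameters are not reproduced here; the following is a minimal one-axis sketch of the underlying EKF pattern it describes: an IMU acceleration drives the prediction step, and an absolute position fix (e.g. from GNSS or LiDAR) drives the correction step. All class names, state layout, and noise values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class SimpleEKF:
    """One-axis Kalman filter sketch: state is [position, velocity]."""

    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(2)            # state estimate [p, v]
        self.P = np.eye(2)              # state covariance
        self.F = np.array([[1.0, dt],   # constant-velocity transition
                           [0.0, 1.0]])
        self.Q = np.diag([1e-4, 1e-2])  # process noise (assumed values)

    def predict(self, accel):
        # Propagate the state using the IMU acceleration as control input.
        B = np.array([0.5 * self.dt**2, self.dt])
        self.x = self.F @ self.x + B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_position(self, z, r=1.0):
        # Correct with an absolute position measurement (GNSS, LiDAR, ...).
        H = np.array([[1.0, 0.0]])      # we observe position only
        S = H @ self.P @ H.T + r        # innovation covariance
        K = (self.P @ H.T) / S          # Kalman gain
        y = z - H @ self.x              # innovation (measurement residual)
        self.x = self.x + (K * y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

ekf = SimpleEKF()
for _ in range(100):                    # 1 s of constant 1 m/s^2 acceleration
    ekf.predict(accel=1.0)
ekf.update_position(z=0.5)              # position fix agrees with prediction
```

In the paper's setting the same predict/correct cycle runs in 3D, with whichever position-like measurements are currently valid (GNSS outdoors, LiDAR or RGB-D indoors, optical flow and barometer throughout) plugged into the correction step.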
