Cooperative navigation in GPS-challenging environments exploiting position broadcast and vision-based tracking

This paper presents an algorithm for improving the navigation performance of an Unmanned Aerial Vehicle (UAV) in GPS-challenging environments, which exploits aiding measurements from one or more cooperative UAVs flying under full GPS coverage. In particular, sensor fusion is based on an Extended Kalman Filter that integrates measurements from onboard inertial sensors and magnetometers, available GPS pseudoranges, position information broadcast by cooperative UAVs, and the line of sight estimated by vision-based tracking. Performance evaluation is carried out for a two-vehicle formation using covariance propagation techniques, while experimental platforms are being integrated for a proof-of-concept flight demonstration. The results show that available pseudorange measurements, or proper dynamics of the UAV under GPS coverage, can compensate for the intrinsic limitations of single line-of-sight aiding and significantly improve navigation performance.
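To illustrate the core fusion step described above, the sketch below shows a minimal Extended Kalman Filter measurement update in which a cooperative UAV at a known (GPS-derived) position provides a vision-based line-of-sight bearing to the aided vehicle. This is an illustrative 2D simplification, not the paper's implementation: the state is reduced to the aided UAV's horizontal position, and the function name, variables, and noise values are assumptions chosen for the example.

```python
import numpy as np

def ekf_bearing_update(x, P, z, coop_pos, R):
    """EKF update for a line-of-sight bearing measured from a cooperative UAV.

    x        : (2,) estimated position of the aided UAV [x, y]
    P        : (2, 2) state covariance
    z        : measured bearing (rad) from the cooperative UAV to the aided UAV
    coop_pos : (2,) known position of the cooperative UAV (broadcast via GPS)
    R        : (1, 1) bearing measurement noise covariance
    """
    dx, dy = x[0] - coop_pos[0], x[1] - coop_pos[1]
    r2 = dx**2 + dy**2
    h = np.arctan2(dy, dx)                        # predicted bearing
    H = np.array([[-dy / r2, dx / r2]])           # Jacobian of h w.r.t. [x, y]
    # innovation, wrapped to (-pi, pi] to avoid angle discontinuities
    nu = np.array([np.arctan2(np.sin(z - h), np.cos(z - h))])
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x_new = x + (K @ nu).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

# Example: aided UAV truly at (10, 10); prior estimate is off.
coop_pos = np.array([0.0, 0.0])
z = np.arctan2(10.0, 10.0)                        # ideal bearing measurement
x_post, P_post = ekf_bearing_update(
    np.array([12.0, 9.0]), 4.0 * np.eye(2), z, coop_pos, np.array([[1e-4]])
)
```

Note that a single bearing only constrains the position component perpendicular to the line of sight, which is the intrinsic limitation of single line-of-sight aiding that the abstract refers to: observability along the ray must come from pseudoranges or from the relative motion of the cooperative UAV.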