Cross-Drone Binocular Coordination for Ground Moving Target Tracking in Occlusion-Rich Scenarios

Working effectively in occlusion-rich environments remains a challenge for airborne vision-based ground target tracking, owing to the inherent limitations of monocular vision. To address this, a novel cross-drone binocular coordination approach, inspired by the efficient coordination of the human eyes, is proposed and developed. The idea, derived from neural models of the human visual system, is to exploit distributed target measurements to overcome occlusion effects. A binocular coordination controller is then developed that enables two distributed pan-tilt cameras to execute synergistic movements similar to those of human eyes. The proposed approach can operate with either binocular or monocular vision, making it practical for a variety of environments. Both testbed and field experiments are conducted for performance evaluation. Testbed experiments highlight its accuracy advantages over independent tracking while remaining robust to partial perception ratios of up to 43%. Field experiments with a pair of drones further demonstrate its effectiveness in real-world scenarios.
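To make the coordination idea concrete, the following is a minimal sketch (not the paper's controller) of how two drone-mounted pan-tilt cameras could fuse their bearing measurements into a shared target estimate and derive synergistic pointing commands, with a monocular fallback when one view is occluded. All function names, geometry conventions, and gains are illustrative assumptions.

```python
# Illustrative sketch of cross-drone binocular coordination (assumed geometry/gains).
import numpy as np

def bearing_to_ray(pan, tilt):
    """Unit ray in the world frame for a camera with pan (yaw) and downward tilt (pitch)."""
    return np.array([
        np.cos(tilt) * np.cos(pan),
        np.cos(tilt) * np.sin(pan),
        -np.sin(tilt),  # positive tilt looks down toward the ground
    ])

def triangulate(p1, d1, p2, d2):
    """Least-squares midpoint of the two bearing rays p_i + t_i * d_i."""
    A = np.stack([d1, -d2], axis=1)   # 3x2 system in the ray parameters t1, t2
    b = p2 - p1
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return 0.5 * ((p1 + t[0] * d1) + (p2 + t[1] * d2))

def pan_tilt_command(cam_pos, target, current_pan, current_tilt, gain=0.5):
    """Proportional command steering a camera toward the shared target estimate."""
    rel = target - cam_pos
    desired_pan = np.arctan2(rel[1], rel[0])
    desired_tilt = np.arctan2(-rel[2], np.hypot(rel[0], rel[1]))
    return (gain * (desired_pan - current_pan),
            gain * (desired_tilt - current_tilt))

# Example: camera 2's view is occluded, so both cameras are steered toward the
# last binocular (triangulated) estimate -- a simple monocular fallback.
p1, p2 = np.array([0.0, 0.0, 30.0]), np.array([40.0, 0.0, 30.0])
pan1, tilt1 = np.deg2rad(20.0), np.deg2rad(45.0)
pan2, tilt2 = np.deg2rad(160.0), np.deg2rad(40.0)

target_est = triangulate(p1, bearing_to_ray(pan1, tilt1),
                         p2, bearing_to_ray(pan2, tilt2))
cmd1 = pan_tilt_command(p1, target_est, pan1, tilt1)
cmd2 = pan_tilt_command(p2, target_est, pan2, tilt2)  # occluded camera follows the shared estimate
print(target_est, cmd1, cmd2)
```

The design intent mirrored here is that both cameras are driven by one fused estimate rather than by their own detections alone, so a temporarily blinded camera keeps moving coherently with its partner.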
