Performance of a Visual Fixation Model in an Autonomous Micro Robot Inspired by Drosophila Physiology

In nature, lightweight, low-power insects are ideal model systems for studying motion perception strategies. Understanding the underlying characteristics and functionality of insect visual systems is not only attractive to neural system modellers but also critical to providing effective solutions for future robotics. This paper presents a novel model of a dynamic vision system, inspired by Drosophila physiology, that mimics fast motion tracking and a closed-loop behavioural response of fixation. The proposed model was realised on the embedded system of an autonomous micro robot with limited computational resources, using a monocular camera as the only motion-sensing modality. Systematic experiments, including open-loop and closed-loop bio-robotic tests, validated the proposed visual fixation model: the robot exhibited motion tracking and fixation behaviours similar to those of insects, while the image processing maintained a frequency of 25–45 Hz. Arena tests also demonstrated successful following behaviour, driven by fixation, during navigation.
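The closed-loop fixation behaviour described above reduces to a simple control cycle: detect motion in the monocular image, estimate its horizontal offset from the image centre, and steer to cancel that offset. Below is a minimal sketch of such a loop, assuming OpenCV for capture (not stated in the paper) and a hypothetical set_wheel_speeds() motor interface; it substitutes plain frame differencing for the paper's Drosophila-inspired neural model, so it illustrates the control structure only, not the authors' method.

```python
# Minimal closed-loop fixation sketch. Assumes an OpenCV-readable monocular
# camera and a hypothetical set_wheel_speeds() API; gains are illustrative.
import cv2

KP = 0.004          # proportional steering gain (illustrative value)
BASE_SPEED = 0.3    # forward speed during pursuit (illustrative value)

def set_wheel_speeds(left, right):
    """Hypothetical motor command; replace with the robot's own interface."""
    pass

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Frame differencing as a stand-in for the model's motion pathway.
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev = gray

    m = cv2.moments(mask)
    if m["m00"] > 0:
        cx = m["m10"] / m["m00"]           # horizontal centroid of motion
        error = cx - gray.shape[1] / 2.0   # offset from the image centre
        turn = KP * error
        # Fixation: steer so the moving target stays frontally centred.
        set_wheel_speeds(BASE_SPEED + turn, BASE_SPEED - turn)
    else:
        set_wheel_speeds(0.0, 0.0)         # no motion detected: hold position
```

The differential drive command keeps the loop closed through the environment: turning shifts the target in the image, which changes the error on the next frame, echoing the insect-like fixation response the paper reports.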
