FicTrac: A visual method for tracking spherical motion and generating fictive animal paths

Studying how animals interact with virtual reality can further our understanding of how attention, learning and memory, sensory processing, and navigation are handled by the brain, at both the neurophysiological and behavioural levels. To this end, we have developed a novel vision-based tracking system, FicTrac (Fictive path Tracking software), for estimating the path an animal makes whilst rotating an air-supported sphere, using only input from a standard camera and computer vision techniques. We have found that the accuracy and robustness of FicTrac outperform those of a low-cost implementation of a standard optical mouse-based approach for generating fictive paths. FicTrac is simple to implement for a wide variety of experimental configurations and, importantly, is fast to execute, enabling real-time sensory feedback for behaving animals. We have used FicTrac to record the behaviour of tethered honeybees, Apis mellifera, whilst presenting visual stimuli in both open-loop and closed-loop experimental paradigms. We found that FicTrac could accurately register the fictive paths of bees as they walked towards bright green vertical bars presented on an LED arena. Using FicTrac, we have demonstrated closed-loop visual fixation in both the honeybee and the fruit fly, Drosophila melanogaster, establishing the flexibility of this system. FicTrac provides the experimenter with a simple yet adaptable system that can be combined with electrophysiological recording techniques to study the neural mechanisms of behaviour in a variety of organisms, including walking vertebrates.
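For illustration, the sketch below shows how per-frame sphere rotation estimates (of the kind a sphere-tracking system such as FicTrac produces) can be accumulated into a fictive 2D path. This is a minimal sketch, not FicTrac's own implementation: the axis conventions, sphere radius, and function names are assumptions made for the example.

```python
import numpy as np

def integrate_fictive_path(rotations, sphere_radius_mm=5.0):
    """Integrate per-frame ball rotation increments into a fictive 2D path.

    Assumed convention (hypothetical, for illustration only):
    `rotations` is an (N, 3) array of per-frame rotation increments in
    radians about the animal's pitch, roll, and yaw axes.
    Returns an (N, 2) array of x/y positions in mm and an (N,) array of
    headings in radians.
    """
    positions = np.zeros((len(rotations), 2))
    headings = np.zeros(len(rotations))
    x = y = heading = 0.0
    for i, (d_pitch, d_roll, d_yaw) in enumerate(rotations):
        # Rotation about the vertical axis changes the animal's heading.
        heading = (heading + d_yaw) % (2 * np.pi)
        # Ball pitch and roll, scaled by the sphere radius, give the
        # forward and sideways step lengths in the animal's body frame.
        forward = d_pitch * sphere_radius_mm
        sideways = d_roll * sphere_radius_mm
        # Rotate the body-frame step into the world frame and accumulate.
        x += forward * np.cos(heading) - sideways * np.sin(heading)
        y += forward * np.sin(heading) + sideways * np.cos(heading)
        positions[i] = (x, y)
        headings[i] = heading
    return positions, headings
```

In a closed-loop configuration, the same per-frame increments could instead be fed directly to the stimulus display so that the virtual environment responds to the animal's walking in real time.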
