Spatial cognition in a virtual reality home-cage extension for freely moving rodents.

Virtual reality (VR) environments are a powerful tool for investigating the brain mechanisms underlying animal behavior. With this technique, animals are usually head-fixed or secured in a harness, and training for cognitively more complex VR paradigms is time consuming. A VR apparatus that allows free animal movement and constant, operator-independent task training would enable many new applications. Key prospective applications include brain imaging in behaving animals carrying a miniaturized mobile device such as a fluorescence microscope or an optetrode. Here, we introduce the Servoball, a spherical VR treadmill based on closed-loop tracking of a freely moving animal and feedback counterrotation of the ball. Furthermore, we present the complete integration of this experimental system with the animals' group home cage, from which individual animals can voluntarily enter the arena through a tunnel with radio-frequency identification (RFID)-automated access control and commence experiments. This automated animal sorter functions as a mechanical replacement for the experimenter. We automatically trained rats to solve spatial cognitive tasks using visual or acoustic cues and recorded spatially modulated entorhinal cells. In extracellular electrophysiological recordings from awake behaving rats, head fixation can dramatically alter results and precludes any complex behavior that requires head movement. The Servoball circumvents this problem in open-field scenarios, as it combines open-field behavior with neuronal recordings and all the flexibility of a virtual environment. This experimental system, integrating the home cage with a VR arena, permits highly efficient complex cognitive experimentation.

NEW & NOTEWORTHY Virtual reality (VR) environments are a powerful tool for the investigation of brain mechanisms. We introduce the Servoball, a VR treadmill for freely moving rodents. The Servoball is integrated with the animals' group home cage. Single individuals voluntarily enter using automated access control. Training is highly time-efficient, even for cognitively complex VR paradigms.
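The closed-loop principle behind the Servoball can be illustrated with a short sketch: the animal is tracked on top of the sphere, and the ball is counterrotated so that the animal's translation is cancelled and it remains near the apex. The camera and drive interfaces, the proportional gain, and the update rate below are illustrative assumptions, not the published implementation.

```python
import time

class Camera:
    """Hypothetical tracking interface; returns the animal's (x, y) offset
    from the apex of the ball, in metres (stub for illustration)."""
    def get_animal_offset(self):
        return (0.0, 0.0)

class BallDrive:
    """Hypothetical servo interface; commands the surface velocity of the
    ball under the animal, in metres per second (stub for illustration)."""
    def set_velocity(self, vx, vy):
        pass

def clamp(value, limit):
    return max(-limit, min(limit, value))

def servo_loop(camera, drive, gain=2.0, rate_hz=100.0, max_speed=0.5, steps=None):
    """Proportional closed-loop control: whenever the tracked animal drifts
    away from the apex, the ball is counterrotated to carry it back, so the
    animal can locomote freely while staying roughly in place."""
    dt = 1.0 / rate_hz
    n = 0
    while steps is None or n < steps:
        x, y = camera.get_animal_offset()              # tracked position error
        drive.set_velocity(clamp(-gain * x, max_speed),
                           clamp(-gain * y, max_speed))  # counterrotation command
        time.sleep(dt)
        n += 1

if __name__ == "__main__":
    servo_loop(Camera(), BallDrive(), steps=10)        # brief dry run with stubs
```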
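The RFID-based animal sorter can likewise be summarized as a small piece of access-control logic: the tunnel gate opens for only one identified animal at a time, so experiments remain individual even though the animals live in a group home cage. The tag handling, session quota, and gate policy below are assumptions for illustration rather than the system described in the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class SorterState:
    """Bookkeeping for the automated animal sorter (illustrative only)."""
    occupant: Optional[str] = None                        # RFID tag of the animal currently in the arena
    sessions: Dict[str, int] = field(default_factory=dict)  # sessions completed per tag today

def request_entry(state: SorterState, tag: str, max_sessions: int = 3) -> bool:
    """Decide whether to open the tunnel gate for the animal carrying `tag`.
    Entry is granted only if the arena is empty and the animal has not yet
    used up its daily session quota (the quota is an assumed policy)."""
    if state.occupant is not None:
        return False                                      # arena occupied: keep the gate closed
    if state.sessions.get(tag, 0) >= max_sessions:
        return False                                      # daily quota reached for this animal
    state.occupant = tag                                  # admit the animal and start a session
    state.sessions[tag] = state.sessions.get(tag, 0) + 1
    return True

def report_return(state: SorterState, tag: str) -> None:
    """Register that the occupant has returned to the home cage, freeing the arena."""
    if state.occupant == tag:
        state.occupant = None
```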
