Bayesian Neural Decoding Using A Diversity-Encouraging Latent Representation Learning Method

Authors: Tian Chen, Lingge Li, Gabriel Elias, Norbert Fortin, Babak Shahbaba

Abstract: It is well established that temporal organization is critical to memory, and that the ability to temporally organize information is fundamental to many perceptual, cognitive, and motor processes. While our understanding of how the brain processes the spatial context of memories has advanced considerably, our understanding of their temporal organization lags far behind. In this paper, we propose a new approach for elucidating the neural basis of complex behaviors and the temporal organization of memories. More specifically, we focus on neural decoding, i.e., the prediction of behavioral or experimental conditions from observed neural data. In general, this is a challenging classification problem of immense interest in neuroscience. Our goal is to develop a new framework that not only improves the overall accuracy of decoding, but also provides a clear latent representation of the decoding process. To accomplish this, our approach uses a variational autoencoder (VAE) with a diversity-encouraging prior based on determinantal point processes (DPPs), which improves latent representation learning by avoiding redundancy in the latent space. We apply our method to data collected from a novel rat experiment that involves presenting repeated sequences of odors at a single port and testing the rats' ability to identify each odor. We show that our method leads to a substantially higher decoding accuracy and allows us to discover novel biological phenomena by providing a clear latent representation of the decoding process.
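The core idea behind the diversity-encouraging prior can be illustrated with a minimal sketch (not the authors' implementation): a DPP assigns higher probability to sets of latent vectors whose kernel Gram matrix has a large determinant, so near-duplicate latent codes are penalized. The RBF kernel, its lengthscale, and the numerical jitter below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(Z, lengthscale=1.0):
    # Pairwise RBF similarities between latent vectors (rows of Z).
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def dpp_log_score(Z, lengthscale=1.0, jitter=1e-6):
    # Unnormalized DPP log-density: log-determinant of the Gram matrix.
    # Larger values indicate more spread-out (diverse) latent codes;
    # redundant codes make the matrix near-singular and the score very negative.
    L = rbf_kernel(Z, lengthscale) + jitter * np.eye(len(Z))
    _, logdet = np.linalg.slogdet(L)
    return logdet

# Well-separated codes score higher than nearly identical ones.
diverse = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
redundant = np.array([[0.0, 0.0], [0.01, 0.0], [0.0, 0.01]])
assert dpp_log_score(diverse) > dpp_log_score(redundant)
```

In a VAE, a term like `dpp_log_score` over the (approximate) posterior means can be added to the training objective to discourage latent collapse; the exact way the prior enters the variational bound is specific to the paper's formulation.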
