Active learning of neural response functions with Gaussian processes

A sizeable literature has focused on the problem of estimating a low-dimensional feature space for a neuron's stimulus sensitivity. Comparatively little work, however, has addressed the problem of estimating the nonlinear function from feature space to spike rate. Here, we use a Gaussian process (GP) prior over the infinite-dimensional space of nonlinear functions to obtain Bayesian estimates of the nonlinearity in the linear-nonlinear-Poisson (LNP) encoding model. This approach offers increased flexibility, robustness, and computational tractability compared to traditional methods (e.g., parametric forms, histograms, cubic splines). We then develop a framework for optimal experimental design under the GP-Poisson model using uncertainty sampling: stimuli are selected adaptively according to an information-theoretic criterion, with the goal of characterizing the nonlinearity with as little experimental data as possible. Our framework relies on a method for rapidly updating hyperparameters under a Gaussian approximation to the posterior. We apply these methods to neural data from a color-tuned simple cell in macaque V1, characterizing its nonlinear response function in the 3D space of cone contrasts, and find that the cell combines cone inputs in a highly nonlinear manner. Using simulated experiments, we show that optimal design substantially reduces the amount of data required to estimate these nonlinear combination rules.
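For intuition, below is a minimal, hypothetical sketch (not the authors' code) of uncertainty sampling under a GP-Poisson model: a Laplace approximation yields a Gaussian posterior over the log-nonlinearity on a grid of candidate stimuli, and the next stimulus is chosen where the posterior is widest. The RBF kernel, the 1D stimulus grid, the exponential link, and all function names are illustrative assumptions; the paper's actual selection criterion and hyperparameter updates may differ.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between 1D stimulus values."""
    d2 = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def laplace_posterior(x_obs, y_obs, x_grid, lengthscale=1.0, variance=1.0, n_iter=50):
    """Laplace (Gaussian) approximation to the posterior over f = log firing rate,
    with spike counts y ~ Poisson(exp(f)). Returns the predictive mean and
    variance of f on the candidate stimulus grid."""
    n = len(x_obs)
    K = rbf_kernel(x_obs, x_obs, lengthscale, variance) + 1e-6 * np.eye(n)
    f = np.zeros(n)
    for _ in range(n_iter):                        # Newton iterations to the posterior mode
        rate = np.exp(f)
        W = np.diag(rate)                          # negative Hessian of the Poisson log-likelihood
        grad = y_obs - rate                        # gradient of the Poisson log-likelihood
        B = np.eye(n) + W @ K
        f = K @ np.linalg.solve(B, W @ f + grad)
    rate = np.exp(f)
    W = np.diag(rate)
    B = np.eye(n) + W @ K
    Ks = rbf_kernel(x_obs, x_grid, lengthscale, variance)
    Kss = rbf_kernel(x_grid, x_grid, lengthscale, variance)
    mean = Ks.T @ (y_obs - rate)                   # predictive mean at the mode
    cov = Kss - Ks.T @ np.linalg.solve(B, W @ Ks)  # predictive covariance
    return mean, np.diag(cov)

def next_stimulus(x_obs, y_obs, x_grid, **kw):
    """Uncertainty sampling: pick the candidate stimulus where the posterior
    over the log-nonlinearity has the largest variance."""
    _, var = laplace_posterior(x_obs, y_obs, x_grid, **kw)
    return x_grid[np.argmax(var)]

# Toy usage: a 1D stimulus grid and a few initial (stimulus, spike-count) pairs.
rng = np.random.default_rng(0)
x_grid = np.linspace(-2, 2, 101)
x_obs = np.array([-1.5, 0.0, 1.5])
y_obs = rng.poisson(np.exp(np.sin(2 * x_obs)))     # hypothetical "true" nonlinearity
print(next_stimulus(x_obs, y_obs, x_grid, lengthscale=0.5))
```

In practice the selected stimulus would be presented, the new spike count appended to the observations, and the posterior recomputed before the next trial.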
