Bayesian Active Learning of Neural Firing Rate Maps with Transformed Gaussian Process Priors

A firing rate map, also known as a tuning curve, describes the nonlinear relationship between a neuron's spike rate and a low-dimensional stimulus (e.g., orientation, head direction, contrast, color). Here we investigate Bayesian active learning methods for estimating firing rate maps in closed-loop neurophysiology experiments. These methods can accelerate the characterization of such maps through intelligent, adaptive selection of stimuli. Specifically, we explore how the prior and the utility function used in Bayesian active learning affect stimulus selection and performance. Our approach relies on a flexible model consisting of a nonlinearly transformed Gaussian process (GP) prior over maps and conditionally Poisson spiking. We show that infomax learning, which selects stimuli to maximize the information gain about the firing rate map, exhibits strong dependence on the seemingly innocuous choice of nonlinear transformation function. We derive an alternative utility function that selects stimuli to minimize the average posterior variance of the firing rate map, and we analyze the surprising relationship between prior parameterization, stimulus selection, and active learning performance in GP-Poisson models. We apply these methods to color tuning measurements of neurons in macaque primary visual cortex.
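The generative model described above (a latent GP passed through a fixed nonlinearity, with conditionally Poisson spike counts) can be sketched as follows. This is an illustrative sketch, not the authors' code: the RBF kernel, the softplus and exponential transformations, and all variable names (`f`, `rate_softplus`, etc.) are our assumptions for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, length_scale=0.5, variance=1.0):
    """Squared-exponential GP covariance over stimulus values (assumed kernel)."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def softplus(z):
    """One common choice of transformation g; exp is another."""
    return np.log1p(np.exp(z))

# Stimulus grid (e.g., orientation in radians) and one draw of the latent GP.
x = np.linspace(0.0, np.pi, 100)
K = rbf_kernel(x) + 1e-6 * np.eye(x.size)          # jitter for stability
f = rng.multivariate_normal(np.zeros(x.size), K)   # latent map f ~ GP(0, K)

# Firing rate map lambda(x) = g(f(x)); the choice of g matters for infomax.
rate_softplus = softplus(f)
rate_exp = np.exp(f)

# Spike counts are conditionally Poisson given the rate (unit time bin).
counts = rng.poisson(rate_softplus)
```

Computing `rate_softplus` and `rate_exp` from the same latent draw makes the abstract's point concrete: the two transformations yield rate maps with very different variance profiles, which is why information-based stimulus selection can depend so strongly on this seemingly innocuous choice.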
