Expectation Propagation for Rectified Linear Poisson Regression

The Poisson likelihood with a rectified linear function as non-linearity is a physically plausible model to describe the stochastic arrival process of photons or other particles at a detector. At low emission rates, the discrete nature of this process leads to measurement noise that behaves very differently from additive white Gaussian noise. To address the intractable inference problem for such models, we present a novel, efficient, and robust Expectation Propagation algorithm, entirely based on analytically tractable computations, that operates reliably in regimes where quadrature-based implementations can fail. Full posterior inference therefore becomes an attractive alternative in areas generally dominated by methods of point estimation. Moreover, we discuss the rectified linear function in the context of other common non-linearities.
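To make the generative model concrete, here is a minimal NumPy sketch of count data drawn from a Poisson likelihood with a rectified-linear rate, as named in the abstract. The latent Gaussian linear model (design matrix X, weights w) and all dimensions are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latent linear model: f = X w with a Gaussian prior on w.
n, d = 100, 5
X = rng.normal(size=(n, d))      # design matrix (assumed for illustration)
w = rng.normal(size=d)           # latent weights, w ~ N(0, I)
f = X @ w                        # latent activations

# Rectified linear non-linearity g(f) = max(f, 0) maps activations to
# non-negative Poisson rates; counts model particle arrivals at a detector.
rate = np.maximum(f, 0.0)
y = rng.poisson(rate)            # observed counts (rate 0 yields y = 0)
```

Inference over w given (X, y) is intractable in closed form, which is the problem the paper's Expectation Propagation algorithm addresses.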
