Sparse signal recovery under Poisson statistics

We are motivated by problems arising in a number of applications, such as explosives detection and online marketing, where the observations are governed by Poisson statistics. Each observation is a Poisson random variable whose mean is a sparse linear superposition of known patterns. Unlike many conventional problems, the observations here are not identically distributed, since they are associated with different sensing modalities. We analyse the performance of a maximum likelihood (ML) decoder, which for our Poisson setting is computationally tractable, and derive fundamental sample complexity bounds for sparse recovery in the high-dimensional setting. We show that when the sensing matrix satisfies the so-called restricted eigenvalue (RE) condition, the ℓ1-regularized ML decoder is consistent; moreover, its error decays exponentially fast in the number of observations. Our results apply to both deterministic and random sensing matrices, and we present several results for each case.
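The ℓ1-regularized ML decoder described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the problem sizes, the nonnegative sensing matrix, and the regularization weight `lam` are all assumptions chosen for the demo. On the nonnegative orthant the ℓ1 penalty reduces to a linear term, so the penalized negative Poisson log-likelihood can be minimized with an off-the-shelf box-constrained solver.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical problem sizes: n observations, dimension p, sparsity k.
n, p, k = 200, 50, 3
A = rng.uniform(0.1, 1.0, size=(n, p))        # nonnegative sensing matrix
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.uniform(2.0, 5.0, k)
y = rng.poisson(A @ x_true)                   # Poisson counts, mean A x

lam = 0.5  # assumed regularization weight (not tuned as in the paper)

def objective(x):
    # Negative Poisson log-likelihood (dropping the x-independent log y! term)
    # plus the l1 penalty, which equals lam * sum(x) for x >= 0.
    mu = A @ x + 1e-12                        # epsilon guards log(0)
    return np.sum(mu - y * np.log(mu)) + lam * np.sum(x)

res = minimize(objective, x0=np.ones(p),
               bounds=[(0.0, None)] * p, method="L-BFGS-B")
x_hat = res.x
```

Restricting to `x >= 0` keeps the Poisson means valid and makes the objective smooth on the feasible set, so a quasi-Newton method suffices for this toy example; the paper's analysis concerns the statistical behavior of the resulting estimator, not any particular solver.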
