Minimax Optimal Sparse Signal Recovery With Poisson Statistics

We are motivated by problems arising in applications such as online marketing and explosives detection, where the observations are usually modeled using Poisson statistics. We model each observation as a Poisson random variable whose mean is a sparse linear superposition of known patterns. Unlike many conventional problems, the observations here are not identically distributed, since they are associated with different sensing modalities. We analyze the performance of a maximum likelihood (ML) decoder, which for our Poisson setting involves a non-linear yet computationally tractable optimization. We derive fundamental sample complexity bounds for sparse recovery when the measurements are contaminated with Poisson noise. In contrast to the least-squares linear regression setting with Gaussian noise, we observe that in the Poisson setting the scale of the parameters, in addition to sparsity, fundamentally impacts the l2 error. We show that our upper bounds are tight under suitable regularity conditions. Specifically, we derive a matching minimax lower bound on the mean-squared error and show that our constrained ML decoder is minimax optimal in this regime.
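The model and decoder described above can be sketched numerically. The following is a minimal illustration, not the paper's algorithm: the dimensions, the sensing matrix, and the use of Richardson-Lucy (EM) multiplicative updates as a stand-in for the constrained ML optimization are all assumptions made here for the example. The data-generating model (Poisson observations whose means are a sparse nonnegative superposition of known patterns) matches the abstract's setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: n observations, p known patterns, k-sparse signal.
n, p, k = 100, 20, 3

# Known positive sensing patterns; each row corresponds to a different
# sensing modality, so the observations are not identically distributed.
A = rng.uniform(0.5, 1.5, size=(n, p))

# k-sparse nonnegative parameter vector.
x_true = np.zeros(p)
x_true[rng.choice(p, size=k, replace=False)] = rng.uniform(1.0, 3.0, size=k)

# Each observation is Poisson with mean given by the sparse superposition.
y = rng.poisson(A @ x_true)

def neg_loglik(x):
    """Negative Poisson log-likelihood up to constants: sum(Ax) - y . log(Ax)."""
    mu = A @ x
    return float(np.sum(mu) - y @ np.log(mu))

# Richardson-Lucy / EM multiplicative updates: a classical iterative scheme
# that monotonically increases the Poisson likelihood over the nonnegative
# orthant. It is used here only as a simple illustrative surrogate for the
# paper's constrained ML decoder.
x_hat = np.ones(p)
col_sums = A.sum(axis=0)
for _ in range(2000):
    x_hat *= (A.T @ (y / (A @ x_hat))) / col_sums

print("NLL at estimate:", neg_loglik(x_hat))
print("NLL at truth:   ", neg_loglik(x_true))
```

The multiplicative form of the update keeps the iterates nonnegative without an explicit projection, which is why it is a convenient sketch for Poisson likelihood maximization under nonnegativity.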
