Density Corrected Sparse Recovery when R.I.P. Condition Is Broken

The Restricted Isometry Property (R.I.P.) is a central condition for recovering sparse vectors from high-dimensional measurements, and traditional methods rely on R.I.P. or its relaxed variants. In real applications, however, features are often correlated with one another, which makes these assumptions too strong to be useful. In this paper, we study the sparse recovery problem in which the feature matrix is strictly non-R.I.P. We prove that when the features exhibit cluster structure, as they often do in practice, the sparse vector can still be recovered consistently. The consistency comes from our proposed density correction algorithm, which removes the variance of the estimated cluster centers using the cluster density. The proposed algorithm converges geometrically and achieves a nearly optimal recovery bound of O(s² log d), where s is the sparsity and d is the nominal dimension.
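For context, the classical R.I.P.-based baseline that the abstract contrasts against is iterative hard thresholding (IHT), which alternates a gradient step with a projection onto s-sparse vectors. The sketch below is a generic textbook IHT on a well-conditioned Gaussian design, not the paper's density correction algorithm; all names (`hard_threshold`, `iht`, the step-size choice) are illustrative assumptions. Under R.I.P. this iteration contracts geometrically, which is exactly what breaks down when features are strongly correlated.

```python
import numpy as np

def hard_threshold(v, s):
    # Keep the s largest-magnitude entries of v, zero out the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def iht(A, y, s, step=None, iters=200):
    # IHT iteration: x_{t+1} = H_s( x_t + step * A^T (y - A x_t) ).
    n, d = A.shape
    if step is None:
        # Conservative step size from the spectral norm of A.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(d)
    for _ in range(iters):
        x = hard_threshold(x + step * A.T @ (y - A @ x), s)
    return x

# Toy example with an i.i.d. Gaussian design, which satisfies R.I.P. with
# high probability -- unlike the correlated, clustered designs the paper targets.
rng = np.random.default_rng(0)
n, d, s = 80, 200, 5
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[:s] = rng.standard_normal(s)
y = A @ x_true
x_hat = iht(A, y, s)
err = np.linalg.norm(x_hat - x_true)
```

On this benign design the recovery error shrinks geometrically with the iteration count; the paper's contribution is to retain such guarantees when the columns of A are clustered and the R.I.P. fails.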
