Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all

Many inverse problems in signal processing deal with the robust estimation of unknown data from underdetermined linear observations. Low-dimensional models, when combined with appropriate regularizers, have been shown to perform this task efficiently. Sparse models with the ℓ1-norm and low-rank models with the nuclear norm are examples of such successful combinations. Stable recovery guarantees in these settings have been established using a common tool adapted to each case: the notion of restricted isometry property (RIP). In this paper, we establish generic RIP-based guarantees for the stable recovery of cones (positively homogeneous model sets) with arbitrary regularizers. These guarantees are illustrated on selected examples. For block-structured sparsity in the infinite-dimensional setting, we apply the guarantees to a family of regularizers whose efficiency in terms of RIP constant can be controlled, leading to stronger and sharper guarantees than the state of the art.
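For context, the guarantees described above rely on a set-restricted version of the RIP. A minimal sketch of the standard formulation follows; the notation (a linear measurement operator M, a model set Σ, and a RIP constant δ) is chosen here for illustration and is not taken from the abstract:

% Set-restricted RIP (standard formulation; notation chosen for illustration).
% M is the linear measurement operator, \Sigma the low-dimensional model set
% (here a cone), and \delta the RIP constant. The bounds are required on the
% secant set \Sigma - \Sigma so that differences of model vectors are preserved.
\[
  (1 - \delta)\,\|x\|^{2} \;\le\; \|Mx\|^{2} \;\le\; (1 + \delta)\,\|x\|^{2}
  \qquad \text{for all } x \in \Sigma - \Sigma .
\]

Classical compressed sensing is recovered by taking Σ to be the set of k-sparse vectors, and low-rank matrix recovery by taking Σ to be the set of rank-r matrices.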
