Infinite-Dimensional Compressed Sensing and Function Interpolation

We introduce and analyse a framework for function interpolation using compressed sensing. This framework, which is based on weighted $\ell^1$ minimization, does not require a priori bounds on the expansion tail in either its implementation or its theoretical guarantees, and in the absence of noise it leads to genuinely interpolatory approximations. We also establish a new recovery guarantee for compressed sensing with weighted $\ell^1$ minimization based on this framework. This guarantee conveys several benefits. First, unlike existing results, it is sharp (up to constants and log factors) for large classes of functions regardless of the choice of weights. Second, by examining the measurement condition in the recovery guarantee, we are able to suggest a good overall strategy for selecting the weights. In particular, when applied to the important case of multivariate approximation with orthogonal polynomials, this weighting strategy leads to provably optimal estimates on the number of measurements required, whenever the support set of the significant coefficients is a so-called lower set. Finally, this guarantee can also be used to theoretically confirm the benefits of alternative weighting strategies in which the weights are chosen based on prior support information. This provides a theoretical basis for a number of recent numerical studies showing the effectiveness of such approaches.
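To make the core optimization concrete: weighted $\ell^1$ minimization seeks the coefficient vector of smallest weighted $\ell^1$ norm consistent with the measurements, $\min_x \sum_i w_i |x_i|$ subject to $Ax = y$. The sketch below is not the paper's algorithm; it is a minimal, generic illustration that recasts this problem as a linear program (splitting each $|x_i|$ into a slack variable $t_i \geq |x_i|$) and solves it with SciPy. The function name, problem sizes, and choice of uniform weights are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_min(A, y, w):
    """Illustrative solver: min_x sum_i w_i |x_i| subject to A x = y.

    Standard LP reformulation with slack variables t >= |x|:
        minimize  w . t
        s.t.      x - t <= 0,  -x - t <= 0,  A x = y,  t >= 0.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.asarray(w, float)])  # objective acts on t only
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # encodes x - t <= 0 and -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # measurement constraint A x = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]

# Toy usage: recover a 3-sparse vector from 30 random measurements in dimension 60.
rng = np.random.default_rng(0)
m, n = 30, 60
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[[5, 17, 42]] = [1.0, -2.0, 0.5]
x_hat = weighted_l1_min(A, A @ x0, np.ones(n))     # uniform weights for illustration
```

In the polynomial-approximation setting discussed in the abstract, the columns of $A$ would be orthogonal polynomials evaluated at sample points, and the weights $w_i$ would grow with the degree of the corresponding basis function rather than being uniform.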
