Correcting for unknown errors in sparse high-dimensional function approximation
[1] Helmut Bölcskei,et al. Recovery of Sparsely Corrupted Signals , 2011, IEEE Transactions on Information Theory.
[2] D. Xiu, et al. Stochastic Collocation Algorithms Using ℓ1-Minimization , 2012 .
[3] Albert Cohen,et al. Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs , 2015 .
[4] R. DeVore,et al. Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDEs , 2010 .
[5] Qiang Li,et al. Robust change point detection method via adaptive LAD-LASSO , 2020 .
[6] Ben Adcock,et al. Robustness to Unknown Error in Sparse Regularization , 2017, IEEE Transactions on Information Theory.
[7] Lars Grasedyck,et al. Hierarchical Tensor Approximation of Output Quantities of Parameter-Dependent PDEs , 2015, SIAM/ASA J. Uncertain. Quantification.
[8] C. D. Boor,et al. On multivariate polynomial interpolation , 1990 .
[9] Fabio Nobile, et al. Analysis of Discrete L2 Projection on Polynomial Spaces with Random Evaluations , 2014, Found. Comput. Math..
[10] Cun-Hui Zhang,et al. Adaptive Lasso for sparse high-dimensional regression models , 2008 .
[11] Dongcai Su,et al. Data recovery from corrupted observations via l1 minimization , 2016, ArXiv.
[12] Victor Chernozhukov,et al. Pivotal estimation via square-root Lasso in nonparametric regression , 2014 .
[13] Anders C. Hansen,et al. On the absence of the RIP in real-world applications of compressed sensing and the RIP in levels , 2014, ArXiv.
[14] Seung Jun Baek,et al. Sufficient Conditions on Stable Recovery of Sparse Signals With Partial Support Information , 2013, IEEE Signal Processing Letters.
[15] Petre Stoica,et al. Connection between SPICE and Square-Root LASSO for sparse parameter estimation , 2014, Signal Process..
[16] Joshua R. Loftus,et al. Selective inference with unknown variance via the square-root lasso , 2015, Biometrika.
[17] George G. Lorentz,et al. Solvability problems of bivariate interpolation I , 1986 .
[18] Dongcai Su. Compressed sensing with corrupted Fourier measurements , 2016, ArXiv.
[19] Nira Dyn,et al. Multivariate polynomial interpolation on lower sets , 2014, J. Approx. Theory.
[20] Dongbin Xiu,et al. Correcting Data Corruption Errors for Multivariate Function Approximation , 2016, SIAM J. Sci. Comput..
[21] Frédéric Hecht, et al. New development in FreeFem++ , 2012, J. Numer. Math..
[22] Ben Adcock, et al. Breaking the Coherence Barrier: A New Theory for Compressed Sensing , 2013, Forum of Mathematics, Sigma.
[23] Albert Cohen,et al. Discrete least squares polynomial approximation with random evaluations − application to parametric and stochastic elliptic PDEs , 2015 .
[24] Sara van de Geer,et al. Estimation and Testing Under Sparsity: École d'Été de Probabilités de Saint-Flour XLV – 2015 , 2016 .
[25] Stephen P. Boyd,et al. Graph Implementations for Nonsmooth Convex Programs , 2008, Recent Advances in Learning and Control.
[26] John Wright,et al. Dense Error Correction Via $\ell^1$-Minimization , 2010, IEEE Transactions on Information Theory.
[27] Xiu Yang, et al. Reweighted ℓ1 minimization method for stochastic elliptic differential equations , 2013, J. Comput. Phys..
[28] Ben Adcock,et al. Compressed Sensing with Sparse Corruptions: Fault-Tolerant Sparse Collocation Approximations , 2017, SIAM/ASA J. Uncertain. Quantification.
[29] Richard G. Baraniuk,et al. Exact signal recovery from sparsely corrupted measurements through the Pursuit of Justice , 2009, 2009 Conference Record of the Forty-Third Asilomar Conference on Signals, Systems and Computers.
[30] Florentina Bunea,et al. The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms , 2013, IEEE Transactions on Information Theory.
[31] H. Rauhut, et al. Interpolation via weighted ℓ1 minimization , 2013, 1308.0759.
[32] Sylvain Arlot,et al. A survey of cross-validation procedures for model selection , 2009, 0907.4728.
[33] Ben Adcock,et al. Infinite-dimensional $\ell^1$ minimization and function approximation from pointwise data , 2015, 1503.02352.
[34] Khachik Sargsyan,et al. Enhancing ℓ1-minimization estimates of polynomial chaos expansions using basis selection , 2014, J. Comput. Phys..
[35] R. Tibshirani. Regression Shrinkage and Selection via the Lasso , 1996 .
[36] John Wright,et al. Dense Error Correction via L1-Minimization , 2008, 0809.0199.
[37] Houman Owhadi,et al. A non-adapted sparse approximation of PDEs with stochastic inputs , 2010, J. Comput. Phys..
[38] R. Tempone,et al. Stochastic Spectral Galerkin and Collocation Methods for PDEs with Random Coefficients: A Numerical Comparison , 2011 .
[39] Holger Rauhut,et al. Compressive sensing Petrov-Galerkin approximation of high-dimensional parametric operator equations , 2014, Math. Comput..
[40] Xiaodong Li,et al. Compressed Sensing and Matrix Completion with Constant Proportion of Corruptions , 2011, Constructive Approximation.
[41] Albert Cohen,et al. Convergence Rates of Best N-term Galerkin Approximations for a Class of Elliptic sPDEs , 2010, Found. Comput. Math..
[42] H. Dette,et al. The adaptive lasso in high-dimensional sparse heteroscedastic models , 2013 .
[43] Ben Adcock,et al. Recovery guarantees for Compressed Sensing with unknown errors , 2017, 2017 International Conference on Sampling Theory and Applications (SampTA).
[44] Sara A. van de Geer,et al. Sharp Oracle Inequalities for Square Root Regularization , 2015, J. Mach. Learn. Res..
[45] Hoang Tran,et al. Polynomial approximation via compressed sensing of high-dimensional functions on lower sets , 2016, Math. Comput..
[46] Stephen P. Boyd,et al. Enhancing Sparsity by Reweighted ℓ1 Minimization , 2007, 0711.1612.
[47] Olcay Arslan,et al. Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression , 2012, Comput. Stat. Data Anal..
[48] H. Zou. The Adaptive Lasso and Its Oracle Properties , 2006 .
[49] Trac D. Tran,et al. Exact Recoverability From Dense Corrupted Observations via $\ell _{1}$-Minimization , 2011, IEEE Transactions on Information Theory.
[50] C. O’Brien. Statistical Learning with Sparsity: The Lasso and Generalizations , 2016 .
[51] Xiaoli Gao. Penalized methods for high-dimensional least absolute deviations regression , 2008 .
[52] Laurent El Ghaoui,et al. Robust sketching for multiple square-root LASSO problems , 2014, AISTATS.
[53] Hansheng Wang,et al. Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso , 2007 .
[54] Cun-Hui Zhang,et al. Scaled sparse linear regression , 2011, 1104.4595.
[55] Ben Adcock,et al. Infinite-Dimensional Compressed Sensing and Function Interpolation , 2015, Foundations of Computational Mathematics.
[56] Ben Adcock,et al. Compressed Sensing Approaches for Polynomial Approximation of High-Dimensional Functions , 2017, 1703.06987.
[57] Xiaoli Gao,et al. Asymptotic analysis of high-dimensional LAD regression with Lasso , 2016 .
[58] Srdjan Stankovic,et al. Missing samples analysis in signals for applications to L-estimation and compressive sensing , 2014, Signal Process..
[59] Albert Cohen,et al. High-Dimensional Adaptive Sparse Polynomial Interpolation and Applications to Parametric PDEs , 2013, Foundations of Computational Mathematics.
[60] B. Logan,et al. Signal recovery and the large sieve , 1992 .
[61] David L Donoho,et al. Compressed sensing , 2006, IEEE Transactions on Information Theory.
[62] Sara van de Geer,et al. Ecole d'été de probabilités de Saint-Flour XLV , 2016 .
[63] Emmanuel J. Candès,et al. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information , 2004, IEEE Transactions on Information Theory.
[64] Kurt B. Ferreira,et al. Fault-tolerant linear solvers via selective reliability , 2012, ArXiv.
[65] Hassan Mansour,et al. Recovering Compressively Sampled Signals Using Partial Support Information , 2010, IEEE Transactions on Information Theory.
[66] Jinfeng Xu,et al. Simultaneous estimation and variable selection in median regression using Lasso-type penalty , 2010, Annals of the Institute of Statistical Mathematics.
[67] Alireza Doostan,et al. A weighted l1-minimization approach for sparse polynomial chaos expansions , 2013, J. Comput. Phys..
[68] A. Belloni,et al. Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming , 2011 .
[69] Kei Hirose. A Mathematical Introduction to Compressive Sensing , 2015 .