ROBUST UNCERTAINTY QUANTIFICATION USING PRECONDITIONED LEAST-SQUARES POLYNOMIAL APPROXIMATIONS WITH ℓ1-REGULARIZATION

We propose a non-iterative, robust numerical method for the non-intrusive uncertainty quantification of multivariate stochastic problems with reasonably compressible polynomial representations. The approximation is robust to data outliers and noisy evaluations that do not fall under the regularity assumption of a stochastic truncation error but pertain to a more complete error model, capable of accounting for physical/computational model (or measurement) errors. The method relies on the cross-validation of a pseudospectral projection of the response onto generalized Polynomial Chaos approximation bases, which allows an initial model selection and assessment yielding a preconditioned response. We then apply an ℓ1-penalized regression to the preconditioned response variable. Nonlinear test cases show this approximation to be more effective in reducing the effect of scattered data outliers than standard compressed-sensing techniques, and of comparable efficiency to iterative robust regression techniques.
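
The following is a minimal sketch of the two-stage idea described above, not the paper's actual algorithm: it assumes a one-dimensional uniform input, a Legendre polynomial-chaos basis, an ordinary discrete least-squares fit as a stand-in for the cross-validated pseudospectral preconditioning step, and scikit-learn's LassoCV as the ℓ1-penalized solver. The function and parameter names (model, degree, n_samples) are illustrative only.

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

def model(xi):
    # Hypothetical nonlinear response with a small fraction of corrupted
    # (outlier) evaluations, mimicking model/measurement errors.
    y = np.exp(0.5 * xi) * np.sin(3.0 * xi)
    outliers = rng.random(xi.shape) < 0.05
    y[outliers] += rng.normal(scale=5.0, size=outliers.sum())
    return y

degree, n_samples = 10, 200
xi = rng.uniform(-1.0, 1.0, n_samples)   # samples of the random input
y = model(xi)

# Legendre (gPC) design matrix: column j contains P_j evaluated at xi.
Psi = legendre.legvander(xi, degree)

# Stage 1 (stand-in for the preconditioning step): a discrete least-squares
# projection whose reconstruction serves as the "preconditioned" response.
coef_ls, *_ = np.linalg.lstsq(Psi, y, rcond=None)
y_precond = Psi @ coef_ls

# Stage 2: l1-penalized (LASSO) regression on the preconditioned response,
# with the penalty level selected by cross-validation.
lasso = LassoCV(cv=5, fit_intercept=False).fit(Psi, y_precond)
coef_sparse = lasso.coef_

print("non-zero gPC coefficients:", np.count_nonzero(coef_sparse))
```

In this sketch the sparsity pattern of coef_sparse plays the role of the compressible polynomial representation; the paper's method differs in how the preconditioned response is built (cross-validated pseudospectral projection rather than a single least-squares fit).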
