Fast hyperbolic wavelet regression meets ANOVA

We use hyperbolic wavelet regression for the fast reconstruction of high-dimensional functions with only low-dimensional variable interactions. Compactly supported periodic Chui-Wang wavelets are used for the tensorized hyperbolic wavelet basis. In a first step we give a self-contained characterization of tensor-product Sobolev-Besov spaces on the d-torus with arbitrary smoothness in terms of the decay of such wavelet coefficients. In the second part we perform and analyze scattered-data approximation using a hyperbolic-cross-type truncation of the basis expansion for the associated least squares method. The corresponding system matrix is sparse due to the compact support of the wavelets, which significantly accelerates the matrix-vector multiplication. In the case of i.i.d. samples we can even bound the approximation error with high probability, losing only logarithmic factors that do not depend on d compared to the best approximation. In addition, if the function has low effective dimension (i.e., only interactions of few variables), we qualitatively determine the variable interactions and, in a second step, omit ANOVA terms with low variance in order to increase the accuracy. This allows us to suggest an adapted model for the approximation. Numerical results demonstrate the efficiency of the proposed method.
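The central computational point of the abstract — compactly supported basis functions make the least-squares system matrix sparse, so iterative solvers only pay for the few nonzeros per row — can be illustrated with a deliberately simplified one-dimensional sketch. The code below uses periodized hat functions (linear B-splines) in place of actual Chui-Wang wavelets, a toy target function, and illustrative names throughout; it is not the paper's method, only a minimal demonstration of the sparse-design-matrix idea.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

def hat(t):
    # linear B-spline centered at 0, support (-1, 1)
    return np.maximum(0.0, 1.0 - np.abs(t))

def design_matrix(x, n):
    # Sparse system matrix A[i, k] = hat(n * x_i - k), periodized on [0, 1).
    # Compact support means each row has at most 2 nonzeros, regardless of n.
    A = lil_matrix((len(x), n))
    for i, xi in enumerate(x):
        for k in range(n):
            t = n * xi - k
            t -= n * np.round(t / n)   # wrap to the nearest period
            v = hat(t)
            if v > 0:
                A[i, k] = v
    return csr_matrix(A)

# scattered i.i.d. samples of a smooth periodic test function
m, n = 200, 16
x = rng.random(m)
y = np.sin(2 * np.pi * x)

A = design_matrix(x, n)
coef = lsqr(A, y)[0]                   # sparse iterative least squares
resid = np.linalg.norm(A @ coef - y) / np.linalg.norm(y)
```

Because every row of `A` has at most two nonzeros, a matrix-vector product costs O(m) instead of O(mn); in the tensorized high-dimensional setting this per-row bound multiplies across coordinates, which is the source of the acceleration claimed above.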
