Interpretable Approximation of High-Dimensional Data
[1] Felix Bartel, et al. Grouped Transformations in High-Dimensional Explainable ANOVA Approximation, 2020, ArXiv.
[2] Markus Holtz, et al. Sparse Grid Quadrature in High Dimensions with Applications in Finance and Insurance, 2010, Lecture Notes in Computational Science and Engineering.
[3] Klaus-Robert Müller, et al. Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models, 2017, ArXiv.
[4] James A. Nichols, et al. Quasi-Monte Carlo finite element methods for elliptic PDEs with lognormal random coefficients, 2015, Numerische Mathematik.
[5] I. M. Sobol'. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates, 2001.
[6] Daniel Potts, et al. Learning multivariate functions with low-dimensional structures using polynomial bases, 2019, ArXiv.
[7] Wojciech Samek, et al. Methods for interpreting and understanding deep neural networks, 2017, Digit. Signal Process.
[8] Benjamin Recht, et al. Random Features for Large-Scale Kernel Machines, 2007, NIPS.
[9] Henryk Woźniakowski, et al. On decompositions of multivariate functions, 2009, Math. Comput.
[10] A. Owen, et al. Valuation of mortgage-backed securities using Brownian bridges to reduce effective dimension, 1997.
[11] Brian C. Ross. Mutual Information between Discrete and Continuous Data Sets, 2014, PLoS ONE.
[12] C. F. Jeff Wu, et al. Experiments, 2021, Wiley Series in Probability and Statistics.
[13] Zhu Li, et al. Towards a Unified Analysis of Random Fourier Features, 2018, ICML.
[14] Art B. Owen, et al. Effective Dimension of Some Weighted Pre-Sobolev Spaces with Dominating Mixed Partial Derivatives, 2017, SIAM J. Numer. Anal.
[15] D. Ruppert. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2004.
[16] Jan Vybíral, et al. Learning Functions of Few Arbitrary Linear Parameters in High Dimensions, 2010, Found. Comput. Math.
[17] Joel A. Tropp, et al. User-Friendly Tail Bounds for Sums of Random Matrices, 2010, Found. Comput. Math.
[18] Kurt Hornik, et al. The support vector machine under test, 2003, Neurocomputing.
[19] Qiqi Wang, et al. Erratum: Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces, 2013, SIAM J. Sci. Comput.
[20] Frances Y. Kuo, et al. Circulant embedding with QMC: analysis for elliptic PDE with lognormal coefficients, 2018, Numerische Mathematik.
[21] Martin J. Mohlenkamp, et al. Multivariate Regression and Machine Learning with Sums of Separable Functions, 2009, SIAM J. Sci. Comput.
[22] Robert Tibshirani, et al. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2001, Springer Series in Statistics.
[23] Christopher M. Bishop. Pattern Recognition and Machine Learning, 2006, Springer.
[24] Rahul Thakur, et al. Exploratory Analysis of Machine Learning Techniques to predict Energy Efficiency in Buildings, 2020, 2020 8th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO).
[25] Gabriele Steidl, et al. Numerical Fourier Analysis, 2019, Fundamentals of Numerical Mathematics for Physicists and Engineers.
[26] R. DeVore, et al. Approximation of Functions of Few Variables in High Dimensions, 2011.
[27] Rachel A. Ward, et al. A near-stationary subspace for ridge approximation, 2016, arXiv:1606.01929.
[28] A. Owen, et al. Estimating Mean Dimensionality of Analysis of Variance Decompositions, 2006.
[29] Tino Ullrich, et al. L2-norm sampling discretization and recovery of functions from RKHS with finite trace, 2020, Sampling Theory, Signal Processing, and Data Analysis.
[30] Andrea Saltelli, et al. Global Sensitivity Analysis: The Primer, 2008.
[31] Daniel Potts, et al. Approximation of High-Dimensional Periodic Functions with Fourier-Based Methods, 2021, SIAM J. Numer. Anal.
[32] Wolfgang Dahmen, et al. Fast high-dimensional approximation with sparse occupancy trees, 2011, J. Comput. Appl. Math.
[33] Rong Jin, et al. Efficient Kernel Clustering Using Random Fourier Features, 2012, 2012 IEEE 12th International Conference on Data Mining.
[34] Ufuk Topcu, et al. Function Approximation via Sparse Random Features, 2021, ArXiv.
[35] Stefan Kunis, et al. Using NFFT 3 - A Software Library for Various Nonequispaced Fast Fourier Transforms, 2009, TOMS.
[36] H. Rabitz, et al. General foundations of high-dimensional model representations, 1999.
[37] Konstantinos G. Margaritis, et al. Multithreaded Local Learning Regularization Neural Networks for Regression Tasks, 2015, EANN.
[38] Rong Jin, et al. Nyström Method vs Random Fourier Features: A Theoretical and Empirical Comparison, 2012, NIPS.
[39] Michael A. Saunders, et al. LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares, 1982, TOMS.
[40] Ralf Klinkenberg, et al. Data Classification: Algorithms and Applications, 2014.
[41] Frances Y. Kuo, et al. Application of Quasi-Monte Carlo Methods to Elliptic PDEs with Random Diffusion Coefficients: A Survey of Analysis and Implementation, 2016, Foundations of Computational Mathematics.
[42] Frances Y. Kuo, et al. Multi-level quasi-Monte Carlo finite element methods for a class of elliptic partial differential equations with random coefficients, 2012, arXiv:1208.6349.
[43] Chong Gu. Smoothing Spline ANOVA Models, 2002.