[1] A. Samarov. Exploring Regression Structure Using Nonparametric Functional Estimation, 1993.
[2] Michael L. Stein et al. Interpolation of spatial data, 1999.
[3] R. Bellman. Dynamic programming, 1957, Science.
[4] Stefan M. Wild et al. Derivative-free optimization methods, 2019, Acta Numerica.
[5] M. Fréchet. Les éléments aléatoires de nature quelconque dans un espace distancié, 1948.
[6] J. Friedman et al. Projection Pursuit Regression, 1981.
[7] Stephen Barnett et al. Matrix Methods for Engineers and Scientists, 1982.
[8] Liping Zhu et al. A Review on Dimension Reduction, 2013, International Statistical Review.
[9] B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, 2002, MIT Press.
[10] Amandine Marrel et al. Estimation of the Derivative-Based Global Sensitivity Measures Using a Gaussian Process Metamodel, 2016, SIAM/ASA J. Uncertain. Quantification.
[11] Ilias Bilionis et al. Gaussian processes with built-in dimensionality reduction: Applications in high-dimensional uncertainty propagation, 2016, arXiv:1602.04550.
[12] Qiqi Wang et al. Erratum: Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces, 2013, SIAM J. Sci. Comput.
[13] Pramudita Satria Palar et al. Exploiting active subspaces in global optimization: how complex is your problem?, 2017, GECCO.
[14] Stefan M. Wild et al. Estimating Derivatives of Noisy Simulations, 2012, TOMS.
[15] Jorge Nocedal et al. Remark on “Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound constrained optimization”, 2011, TOMS.
[16] Ker-Chau Li et al. On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma, 1992.
[17] Neil D. Lawrence et al. Bayesian Gaussian Process Latent Variable Model, 2010, AISTATS.
[18] Ilse C. F. Ipsen et al. A Probabilistic Subspace Bound with Application to Active Subspaces, 2018, SIAM J. Matrix Anal. Appl.
[19] K. Fukumizu et al. Gradient-Based Kernel Dimension Reduction for Regression, 2014.
[20] Andy J. Keane et al. Dimension Reduction for Aerodynamic Design Optimization, 2011.
[21] T. Labopin-Richard et al. Sequential design of experiments for estimating percentiles of black-box functions, 2016, arXiv:1605.05524.
[22] Juan J. Alonso et al. On Active Subspaces in Car Aerodynamics, 2016.
[23] I. M. Sobol'. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates, 2001.
[24] Sucharita Ghosh et al. Kernel Smoothing: Principles, Methods and Applications, 2017.
[25] David Ginsbourger et al. ANOVA kernels and RKHS of zero mean functions for model-based sensitivity analysis, 2011, J. Multivar. Anal.
[26] Ker-Chau Li et al. Sliced Inverse Regression for Dimension Reduction, 1991.
[27] Andy J. Keane et al. Engineering Design via Surrogate Modelling - A Practical Guide, 2008.
[28] Pramudita Satria Palar et al. On the Accuracy of Kriging Model in Active Subspaces, 2018.
[29] Shigeru Obayashi et al. Kriging surrogate model with coordinate transformation based on likelihood and gradient, 2017, J. Glob. Optim.
[30] David Ginsbourger et al. Additive Kernels for Gaussian Process Modeling, 2011, arXiv:1103.4023.
[31] Nando de Freitas et al. Bayesian Optimization in a Billion Dimensions via Random Embeddings, 2013, J. Artif. Intell. Res.
[32] Nabil Rachdi et al. New sensitivity analysis subordinated to a contrast, 2013, arXiv:1305.2329.
[33] Andreas Krause et al. High-Dimensional Gaussian Process Bandits, 2013, NIPS.
[34] C. Eckart et al. The approximation of one matrix by another of lower rank, 1936.
[35] Mike Ludkovski et al. Replication or Exploration? Sequential Design for Stochastic Simulation Experiments, 2017, Technometrics.
[36] Paul G. Constantine et al. Data-Driven Polynomial Ridge Approximation Using Variable Projection, 2017, SIAM J. Sci. Comput.
[37] Kaare Brandt Petersen et al. The Matrix Cookbook, 2006.
[38] Michael B. Wakin et al. Computing active subspaces efficiently with gradient sketching, 2015, IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP).
[39] Carl E. Rasmussen et al. Gaussian processes for machine learning, 2005, Adaptive Computation and Machine Learning.
[40] Bing Li et al. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice, 2016, J. Comput. Phys.
[41] B. Efron. Nonparametric estimates of standard error: The jackknife, the bootstrap and other methods, 1981.
[42] Malek Ben Salem et al. Sequential dimension reduction for learning features of expensive black-box functions, 2019.
[43] Minyong R. Lee et al. Modified Active Subspaces Using the Average of Gradients, 2019, SIAM/ASA J. Uncertain. Quantification.
[44] L. Mirsky. Symmetric Gauge Functions and Unitarily Invariant Norms, 1960.
[45] B. Iooss et al. A Review on Global Sensitivity Analysis Methods, 2014, arXiv:1404.2405.
[46] Roman Garnett et al. Active Learning of Linear Embeddings for Gaussian Processes, 2013, UAI.
[47] Paul G. Constantine et al. Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments, 2017, Statistics and Computing.
[48] Carl E. Rasmussen et al. Additive Gaussian Processes, 2011, NIPS.
[49] Christopher K. I. Williams et al. Discovering Hidden Features with Gaussian Processes Regression, 1998, NIPS.
[50] Chih-Li Sung et al. Multiresolution Functional ANOVA for Large-Scale, Many-Input Computer Experiments, 2017, Journal of the American Statistical Association.