Computing active subspaces efficiently with gradient sketching

Active subspaces are an emerging set of tools for identifying and exploiting the most important directions in the space of a computer simulation's input parameters; these directions depend on the simulation's quantity of interest, which we treat as a function from inputs to outputs. To identify a function's active subspace, one must compute the eigenpairs of a matrix derived from the function's gradient, which presents challenges when the gradient is not available as a subroutine. We numerically study two methods for estimating the necessary eigenpairs using only linear measurements of the function's gradient. In practice, these measurements can be estimated by finite differences using only two function evaluations, regardless of the dimension of the function's input space.
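The idea above can be illustrated with a minimal NumPy sketch. This is a hypothetical illustration, not the paper's two methods: it estimates the eigenpairs of the matrix C = E[∇f ∇fᵀ] using only random linear measurements aᵀ∇f(x), each obtained from a two-point forward finite difference (two evaluations of f, independent of the input dimension). The function names, the uniform input sampling, and the Gaussian sketching directions are all assumptions made for the example.

```python
import numpy as np

def directional_derivative(f, x, a, h=1e-6):
    """Linear measurement a^T grad f(x) via a forward finite difference.
    Costs two function evaluations regardless of the input dimension."""
    return (f(x + h * a) - f(x)) / h

def sketched_active_subspace(f, m, n_samples=200, n_dirs=None, h=1e-6, seed=0):
    """Hypothetical sketch (not the paper's algorithms): estimate the
    eigenpairs of C = E[grad f grad f^T] from random linear measurements
    of the gradient, reconstructed as grad f(x) ~ A A^T grad f(x)."""
    rng = np.random.default_rng(seed)
    n_dirs = n_dirs if n_dirs is not None else m
    C = np.zeros((m, m))
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0, size=m)      # sample an input point
        # Gaussian sketching matrix with E[A A^T] = I.
        A = rng.standard_normal((m, n_dirs)) / np.sqrt(n_dirs)
        # n_dirs linear measurements of the gradient, each needing
        # only two evaluations of f.
        g = np.array([directional_derivative(f, x, A[:, j], h)
                      for j in range(n_dirs)])
        grad_est = A @ g                        # reconstructed gradient
        C += np.outer(grad_est, grad_est)
    C /= n_samples
    eigvals, eigvecs = np.linalg.eigh(C)        # ascending order
    order = np.argsort(eigvals)[::-1]           # sort descending
    return eigvals[order], eigvecs[:, order]
```

For a function that varies only along one direction, e.g. f(x) = sin(wᵀx), the leading eigenvector returned by this sketch should align with w, and the trailing eigenvalues should be near zero — the one-dimensional active subspace.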
