Spectrum Gaussian Processes Based On Tunable Basis Functions

Spectral approximation and variational inducing-point learning are two popular methods for reducing the computational complexity of Gaussian processes. However, previous research has tended to rely on orthonormal basis functions, such as eigenfunctions of the Hilbert-space Laplacian in spectral methods or decoupled orthogonal components in the variational framework. In this paper, inspired by quantum physics, we introduce a novel family of basis functions, which are tunable, local, and bounded, to approximate the kernel function of a Gaussian process. These functions have two adjustable parameters that control their mutual orthogonality and their boundedness. We conduct extensive experiments on open-source datasets to evaluate the method's performance. Compared with several state-of-the-art methods, the proposed approach obtains comparable or better results, especially when the kernel function is poorly chosen.
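To make the spectral route concrete, the following is a minimal sketch of the standard reduced-rank (spectral) GP approximation that the abstract contrasts with: the kernel is approximated as k(x, x') ≈ Σ_j S(√λ_j) φ_j(x) φ_j(x'), where φ_j are orthonormal Laplacian eigenfunctions on an interval and S is the kernel's spectral density. This is the orthonormal-basis baseline, not the paper's tunable basis; the domain half-width `L`, the number of basis functions `m`, and the RBF hyperparameters below are illustrative choices, not values from the paper.

```python
import numpy as np

def basis(x, m, L):
    # Orthonormal Laplacian eigenfunctions on [-L, L]:
    # phi_j(x) = sin(pi * j * (x + L) / (2L)) / sqrt(L)
    j = np.arange(1, m + 1)
    return np.sin(np.pi * j * (x[:, None] + L) / (2 * L)) / np.sqrt(L)

def spectral_density_rbf(w, variance=1.0, lengthscale=0.3):
    # Spectral density of the 1-D RBF kernel evaluated at frequency w.
    return variance * np.sqrt(2 * np.pi) * lengthscale * np.exp(-0.5 * (lengthscale * w) ** 2)

def fit_predict(x_train, y_train, x_test, m=30, L=2.0, noise=0.1):
    # Eigenvalues of the negative Laplacian: lambda_j = (pi * j / (2L))^2.
    j = np.arange(1, m + 1)
    w = np.pi * j / (2 * L)
    s = spectral_density_rbf(w)            # diagonal prior covariance of the weights
    Phi = basis(x_train, m, L)
    # Posterior mean of the weights in the equivalent Bayesian linear model:
    # (Phi^T Phi + noise^2 * diag(1/s)) w = Phi^T y
    A = Phi.T @ Phi + noise**2 * np.diag(1.0 / s)
    mean_w = np.linalg.solve(A, Phi.T @ y_train)
    return basis(x_test, m, L) @ mean_w

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 80)
y = np.sin(3 * x) + 0.05 * rng.standard_normal(80)
x_star = np.linspace(-1, 1, 50)
f_star = fit_predict(x, y, x_star)
```

The cost is O(nm^2) instead of the O(n^3) of exact GP regression, since only an m x m system is solved. The proposed tunable basis replaces the fixed orthonormal φ_j above with parameterized functions whose orthogonality and support can be adjusted.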
