Efficient sparsification for Gaussian process regression

Abstract

Sparse Gaussian process models provide an efficient way to perform regression on large data sets. Sparsification approaches select a representative subset of the available training data to induce the sparse model approximation. A variety of insertion and deletion criteria have been proposed, but they either lack accuracy or suffer from high computational costs. In this paper, we present a new and straightforward criterion for the successive selection and deletion of training points in sparse Gaussian process regression. The proposed sparsification strategies are as fast as purely randomized schemes and are therefore suitable for online learning. Experiments on real-world robot data demonstrate that the resulting regression models are competitive with computationally intensive state-of-the-art methods in terms of generalization and accuracy. Furthermore, we employ our approach to learn inverse dynamics models for compliant robot control from very large data sets, i.e., half a million training points. This experiment also shows that the approximated sparse Gaussian process model is sufficiently fast for real-time prediction in robot control.
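The abstract does not spell out the proposed insertion/deletion criterion, so the following is only a minimal sketch of the general setting: sparse Gaussian process regression over a subset of inducing points, here chosen uniformly at random as in the purely randomized baseline the paper compares against. All function names, parameters, and the subset-of-regressors predictive form are illustrative assumptions, not the paper's method.

```python
# Minimal sketch: sparse GP regression with a randomly selected inducing
# subset (subset-of-regressors approximation). Illustrative only; the
# paper's actual selection/deletion criterion is not reproduced here.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel k(a, b) = s^2 * exp(-|a - b|^2 / (2 l^2)).
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sparse_gp_predict(X, y, X_test, m=50, noise=1e-2, seed=0):
    # Choose m inducing points uniformly at random ("purely randomized"
    # scheme); the paper's contribution is a smarter, equally fast rule.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Xu = X[idx]
    Kuu = rbf_kernel(Xu, Xu)          # m x m
    Kuf = rbf_kernel(Xu, X)           # m x n
    Kus = rbf_kernel(Xu, X_test)      # m x n_test
    # Subset-of-regressors predictive mean:
    #   mu_* = K_{*u} (K_{uf} K_{fu} + sigma^2 K_{uu})^{-1} K_{uf} y
    # Cost is O(n m^2) instead of O(n^3) for the full GP.
    A = Kuf @ Kuf.T + noise * Kuu + 1e-8 * np.eye(len(Xu))
    w = np.linalg.solve(A, Kuf @ y)
    return Kus.T @ w

if __name__ == "__main__":
    # Toy usage: noisy sine data, 30 inducing points out of 500.
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
    X_test = np.linspace(-3, 3, 100)[:, None]
    print(sparse_gp_predict(X, y, X_test, m=30)[:5])
```

With m inducing points the per-prediction cost drops to O(m) for the mean, which is what makes such approximations viable for the real-time robot-control setting described in the abstract.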
