Bayesian transfer learning between Gaussian process regression tasks

Bayesian knowledge transfer in supervised learning often relies on a complete specification and optimization of the stochastic dependence between the source and target tasks. This requirement of completely modelled settings can be difficult to justify in practice. We propose a strategy to overcome it: fully probabilistic design yields a target algorithm that accepts source knowledge in the form of a probability distribution, without requiring the source-target dependence to be modelled explicitly. We present this incompletely modelled setting in the supervised learning context, where both the source and target tasks perform Gaussian process regression. Experimental evaluation demonstrates that transferring the source distribution substantially improves the target learner's prediction performance when recovering a distorted nonparametric function realization from noisy data.
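The target task above is standard Gaussian process regression: recovering a smooth function from noisy observations via the GP posterior predictive. The following is a minimal sketch of that base task only, not of the paper's fully probabilistic design transfer mechanism; the RBF kernel, its hyperparameters, and the test function are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input sets."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.01):
    """Posterior predictive mean and variance of a zero-mean GP."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky-based solve for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, np.diag(cov)

# Noisy observations of a smooth nonparametric function realization
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)
xs = np.linspace(0.0, 5.0, 50)
mean, var = gp_posterior(x, y, xs)
```

In the paper's setting, a second (source) GP learner would additionally supply its predictive distribution, which the target processes probabilistically rather than through an explicitly modelled inter-task dependence.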