Generalized Product of Experts for Automatic and Principled Fusion of Gaussian Process Predictions

In this work, we propose a generalized product of experts (gPoE) framework for combining the predictions of multiple probabilistic models. We identify four desirable properties that are important for scalability, expressiveness, and robustness when learning and inferring with a combination of multiple models. Through analysis and experiments, we show that the gPoE of Gaussian processes (GPs) has these qualities, while no other existing combination scheme satisfies all of them at the same time. The resulting GP-gPoE is highly scalable, as individual GP experts can be learned independently in parallel; highly expressive, as the way experts are combined depends on the input rather than being fixed; the combined prediction is still a valid probabilistic model with a natural interpretation; and it is robust to unreliable predictions from individual experts.
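As a rough sketch of the kind of fusion the abstract describes, the snippet below combines independent Gaussian predictive distributions with input-dependent weights in a gPoE-style, precision-weighted rule: each expert contributes its predictive precision scaled by a weight, and the fused mean is the corresponding precision-weighted average of the expert means. The function name gpoe_fuse, the array layout, and the uniform default weights are illustrative assumptions rather than the paper's implementation; in practice the weights would be chosen per input, reflecting how reliable each expert is at that point.

```python
import numpy as np

def gpoe_fuse(means, variances, weights=None):
    """Fuse per-expert Gaussian predictions with a gPoE-style rule.

    means, variances: arrays of shape (n_experts, n_points) holding each GP
    expert's predictive mean and variance at the test points.
    weights: optional array of the same shape with per-expert, per-point
    weights; if None, uniform weights 1/n_experts are used (a simple
    illustrative choice, not the paper's weighting scheme).
    Returns the fused mean and variance, each of shape (n_points,).
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    if weights is None:
        weights = np.full_like(means, 1.0 / means.shape[0])

    precisions = weights / variances          # weighted expert precisions
    fused_var = 1.0 / precisions.sum(axis=0)  # combined predictive variance
    fused_mean = fused_var * (precisions * means).sum(axis=0)
    return fused_mean, fused_var

# Example: two experts predicting at three test points.
mu = np.array([[0.0, 1.0, 2.0],
               [0.5, 1.0, 1.5]])
var = np.array([[1.0, 0.5, 2.0],
                [0.25, 0.5, 1.0]])
m, v = gpoe_fuse(mu, var)
```

Note that because each expert enters through its precision, a weight of zero simply removes that expert's influence at that input, which is one way to see the robustness to unreliable experts mentioned above.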
