Another related technique is the Bayesian Committee Support Vector Machine presented in [12].

In this paper we propose the concept of coupling for ensemble learning. In the existing literature, all submodels considered within an ensemble are trained independently of each other. Here we study the effect of coupling the individual training processes within an ensemble of regularization networks. The considered coupling scheme also makes it possible to work with a transductive set for both regression and classification problems. We discuss links between this coupled learning and multitask learning, and explain how it can be interpreted as a form of group regularization. The methods are illustrated with experiments on classification and regression data sets.
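As a rough illustration (a sketch in our own notation, not necessarily the exact objective used in the paper; the symbols $\gamma$, $\lambda$, $N$, $T$ and the coupling set are our assumptions), coupling an ensemble of $M$ regularization networks $f_1,\dots,f_M$ can be thought of as replacing $M$ independent training problems by one joint problem with a group-regularization term:

\[
\min_{f_1,\dots,f_M} \; \sum_{m=1}^{M} \Big[ \sum_{i=1}^{N} \big( y_i - f_m(x_i) \big)^2 + \gamma \, \| f_m \|_{\mathcal{H}}^2 \Big]
\; + \; \lambda \sum_{m=1}^{M} \sum_{j=1}^{T} \Big( f_m(\tilde{x}_j) - \tfrac{1}{M} \sum_{k=1}^{M} f_k(\tilde{x}_j) \Big)^2 ,
\]

where the last term penalizes disagreement between the submodels on a coupling set $\{\tilde{x}_j\}_{j=1}^{T}$, which may contain unlabelled (transductive) points. Setting $\lambda = 0$ recovers the usual ensemble of independently trained submodels, while $\lambda > 0$ couples their training processes and acts as a form of group regularization.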

[1] Robert E. Schapire, et al. A Brief Introduction to Boosting, 1999, IJCAI.

[2] Samy Bengio, et al. A Parallel Mixture of SVMs for Very Large Scale Problems, 2001, Neural Computation.

[3] M. Pontil. Leave-one-out error and stability of learning algorithms with applications, 2002.

[4] Johan A. K. Suykens, et al. Intelligence and Cooperative Search by Coupled Local Minimizers, 2002, Int. J. Bifurc. Chaos.

[5] Vladimir Vapnik, et al. Statistical learning theory, 1998.

[6] J. Suykens, et al. Ensemble Learning of Coupled Parameterised Kernel Models, 2003.

[7] Rich Caruana. Multitask Learning, 1997, Machine Learning.

[8] T. Poggio, et al. Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks, 1990, Science.

[9] Nello Cristianini, et al. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, 2000.

[10] Thorsten Joachims, et al. Transductive Inference for Text Classification using Support Vector Machines, 1999, ICML.

[11] Johan A. K. Suykens, et al. Least Squares Support Vector Machines, 2002, World Scientific.

[12] Anton Schwaighofer, et al. The Bayesian Committee Support Vector Machine, 2001, ICANN.

[13] Jianchang Mao, et al. Scaling-up support vector machines using boosting algorithm, 2000, Proceedings 15th International Conference on Pattern Recognition (ICPR-2000).

[14] D. G. Krige. A statistical approach to some basic mine valuation problems on the Witwatersrand, 1951.

[15] Tom Heskes, et al. Clustering ensembles of neural network models, 2003, Neural Networks.

[16] Shun-ichi Amari, et al. On different ensembles of kernel machines, 2003, ESANN.

[17] Leo Breiman, et al. Bagging Predictors, 1996, Machine Learning.

[18] Tomaso A. Poggio, et al. Regularization Networks and Support Vector Machines, 2000, Adv. Comput. Math.

[19] Heekuck Oh, et al. Neural Networks for Pattern Recognition, 1993, Adv. Comput.

[20] Catherine Blake, et al. UCI Repository of machine learning databases, 1998.

[21] Tomaso A. Poggio, et al. Bounds on the Generalization Performance of Kernel Machine Ensembles, 2000, ICML.