Boost-wise pre-loaded mixture of experts for classification tasks

This paper presents a modified version of the Boosted Mixture of Experts (BME). While previous related works, namely BME, attempt to improve performance by incorporating the complementary features of a hybrid combining framework, they have some drawbacks. Analyzing the problems of these approaches suggested several modifications that led us to propose a new method called Boost-wise Pre-loaded Mixture of Experts (BPME). We present a modification of the pre-loading (initialization) procedure of the Mixture of Experts (ME), which addresses and overcomes the earlier problems by employing a two-stage pre-loading procedure. In this approach, both error and confidence measures are used as the difficulty criteria in the boost-wise partitioning of the problem space.
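The abstract only outlines the idea, so the following is a minimal Python sketch of how a boost-wise, two-stage pre-loading step could look, assuming scikit-learn MLP networks as experts. The 0.5 confidence threshold and the helper names partition_by_difficulty and preload_experts are illustrative assumptions, not the authors' implementation. In a complete BPME system the pre-loaded experts would subsequently be trained jointly with a gating network; that step is omitted here.

```python
# Illustrative sketch of boost-wise pre-loading, NOT the authors' exact algorithm.
# Assumptions: MLPClassifier experts, a 0.5 confidence threshold, and the helper
# names used below are all chosen for illustration only.
import numpy as np
from sklearn.neural_network import MLPClassifier


def partition_by_difficulty(expert, X, y, conf_threshold=0.5):
    """Split samples into easy/hard subsets using both the error
    (misclassification) and the confidence of the expert's output."""
    proba = expert.predict_proba(X)
    pred = expert.classes_[np.argmax(proba, axis=1)]
    class_index = np.searchsorted(expert.classes_, y)
    confidence = proba[np.arange(len(y)), class_index]   # confidence on the true class
    hard = (pred != y) | (confidence < conf_threshold)   # wrong OR unsure -> difficult
    return ~hard, hard


def preload_experts(X, y, n_experts=3, conf_threshold=0.5, seed=0):
    """Boost-wise pre-loading: each new expert is initialized on the samples
    the previous expert found difficult, before joint ME training."""
    experts = []
    X_cur, y_cur = X, y
    for k in range(n_experts):
        expert = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                               random_state=seed + k)
        expert.fit(X_cur, y_cur)
        experts.append(expert)
        _, hard = partition_by_difficulty(expert, X_cur, y_cur, conf_threshold)
        if hard.sum() < 2 or len(np.unique(y_cur[hard])) < 2:
            break                                  # nothing difficult left to pass on
        X_cur, y_cur = X_cur[hard], y_cur[hard]    # next expert focuses on the hard region
    return experts
```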
