Particle swarm optimization based multi-prototype ensembles

This paper proposes and evaluates a Particle Swarm Optimization (PSO) based ensemble classifier. The members of the ensemble are Nearest Prototype Classifiers generated sequentially with PSO and combined by majority voting. Two requirements for good ensemble performance are accuracy and diversity of error. Accuracy is achieved by having PSO minimize a fitness function representing the error rate as each member is created. Diversity of error is promoted in two ways: by using a different PSO initialization each time a new member is created, and by adopting decorrelated training, in which a penalty term added to the fitness function penalizes particles that make the same errors as previously generated classifiers. Simulation experiments on different classification problems show that the ensemble outperforms a single classifier and that the proposed mechanisms are effective in generating diverse ensemble members. A minimal sketch of the described approach follows.
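The paper does not include code; the sketch below is an illustrative reconstruction of the described scheme, assuming integer class labels, Euclidean distance for the Nearest Prototype Classifiers, a standard global-best PSO with common inertia and acceleration settings, and a simple additive decorrelation penalty weighted by a hypothetical parameter `lam`. All function names and parameter values are assumptions, not taken from the paper.

```python
import numpy as np

def nearest_prototype_predict(prototypes, proto_labels, X):
    """Assign each sample the label of its nearest prototype (Euclidean distance)."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]

def fitness(prototypes, proto_labels, X, y, prev_error_masks, lam):
    """Error rate plus a decorrelation penalty for repeating earlier members' mistakes."""
    errors = nearest_prototype_predict(prototypes, proto_labels, X) != y
    # Penalize errors that coincide with those of already-trained ensemble members.
    penalty = sum(np.logical_and(errors, mask).mean() for mask in prev_error_masks)
    return errors.mean() + lam * penalty

def train_member_pso(X, y, proto_labels, prev_error_masks, lam=0.5,
                     n_particles=20, n_iter=100, w=0.72, c1=1.49, c2=1.49, rng=None):
    """Optimize prototype coordinates with a global-best PSO (illustrative settings)."""
    rng = rng if rng is not None else np.random.default_rng()
    n_proto, dim = len(proto_labels), X.shape[1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    # Each particle encodes a full set of prototype positions, initialized in the data range.
    pos = rng.uniform(lo, hi, size=(n_particles, n_proto, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, proto_labels, X, y, prev_error_masks, lam) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()
    gbest_fit = pbest_fit.min()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        for i, p in enumerate(pos):
            f = fitness(p, proto_labels, X, y, prev_error_masks, lam)
            if f < pbest_fit[i]:
                pbest_fit[i], pbest[i] = f, p.copy()
                if f < gbest_fit:
                    gbest_fit, gbest = f, p.copy()
    return gbest

def build_ensemble(X, y, n_members=5, protos_per_class=2, lam=0.5, seed=0):
    """Sequentially train NPC members; later members are decorrelated from earlier ones."""
    rng = np.random.default_rng(seed)
    proto_labels = np.repeat(np.unique(y), protos_per_class)
    members, error_masks = [], []
    for _ in range(n_members):
        protos = train_member_pso(X, y, proto_labels, error_masks, lam=lam, rng=rng)
        members.append((protos, proto_labels))
        # Record this member's training errors so the next member is penalized for repeating them.
        error_masks.append(nearest_prototype_predict(protos, proto_labels, X) != y)
    return members

def ensemble_predict(members, X):
    """Combine member outputs by majority vote."""
    votes = np.stack([nearest_prototype_predict(p, pl, X) for p, pl in members])
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

As a usage illustration, `build_ensemble(X_train, y_train)` followed by `ensemble_predict(members, X_test)` would produce majority-vote predictions; the fresh random initialization of each PSO run and the `prev_error_masks` penalty are the two diversity mechanisms the abstract describes.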
