A K-means-based Multi-subpopulation Particle Swarm Optimization for Neural Network Ensemble

This paper presents a k-means-based multi-subpopulation particle swarm optimization, denoted KMPSO, for training neural network ensembles. In the proposed KMPSO, particles are dynamically partitioned into clusters via the k-means clustering algorithm at every iteration, and each resulting cluster is responsible for training one component neural network. The performance of KMPSO has been evaluated on several benchmark problems. Our results show that the proposed method can effectively control the trade-off between diversity and accuracy in the ensemble, achieving competitive results in comparison with related algorithms.

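The sketch below illustrates the general idea described in the abstract: particles encoding network weights are re-clustered with k-means at every iteration, each cluster runs its own PSO update around a cluster-local best, and the best particle of each cluster contributes one component network to the ensemble. All names, hyperparameters, the fitness function, and the use of a cluster-local best are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of a k-means-based multi-subpopulation PSO for an ensemble of
# small feed-forward networks. Assumptions (not the paper's exact method):
# particles encode flattened weights of a 1-hidden-layer net; each cluster's
# best personal-best particle acts as that subpopulation's guide; the final
# ensemble averages the outputs of one network per cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def predict(weights, X, n_hidden=5):
    """Decode a flat weight vector into a 1-hidden-layer net and run it on X."""
    d = X.shape[1]
    w1 = weights[: d * n_hidden].reshape(d, n_hidden)
    b1 = weights[d * n_hidden : d * n_hidden + n_hidden]
    w2 = weights[d * n_hidden + n_hidden : d * n_hidden + 2 * n_hidden]
    b2 = weights[-1]
    h = np.tanh(X @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))  # sigmoid output

def fitness(weights, X, y):
    """Negative mean squared error (higher is better)."""
    return -np.mean((predict(weights, X) - y) ** 2)

def kmpso(X, y, n_particles=30, k=3, iters=100, w=0.7, c1=1.5, c2=1.5, n_hidden=5):
    dim = X.shape[1] * n_hidden + 2 * n_hidden + 1  # length of a weight vector
    pos = rng.normal(scale=0.5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_fit = pos.copy(), np.array([fitness(p, X, y) for p in pos])

    for _ in range(iters):
        # Re-partition the swarm into k clusters at every iteration.
        labels = KMeans(n_clusters=k, n_init=5, random_state=0).fit_predict(pos)
        for c in range(k):
            idx = np.where(labels == c)[0]
            if idx.size == 0:
                continue
            # Cluster-local best guides only the particles in this cluster.
            lbest = pbest[idx[np.argmax(pbest_fit[idx])]]
            r1, r2 = rng.random((2, idx.size, dim))
            vel[idx] = (w * vel[idx]
                        + c1 * r1 * (pbest[idx] - pos[idx])
                        + c2 * r2 * (lbest - pos[idx]))
            pos[idx] += vel[idx]
        # Update personal bests.
        fit = np.array([fitness(p, X, y) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]

    # One component network per cluster: the best particle of each final cluster.
    labels = KMeans(n_clusters=k, n_init=5, random_state=0).fit_predict(pbest)
    members = [pbest[np.where(labels == c)[0][np.argmax(pbest_fit[labels == c])]]
               for c in range(k) if np.any(labels == c)]
    # Ensemble prediction: average the member networks' outputs.
    return lambda Xnew: np.mean([predict(m, Xnew, n_hidden) for m in members], axis=0)

# Toy usage on a synthetic binary problem.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
ensemble = kmpso(X, y)
print("training accuracy:", np.mean((ensemble(X) > 0.5) == y))
```

Re-clustering each iteration keeps the subpopulations aligned with the current distribution of particles, which is what lets the clusters specialize into diverse component networks rather than collapsing onto a single global best.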