Neural network ensemble with combination weights adaptive modulation

Numerous schemes exist for constructing neural network ensembles, and the most common aggregation strategy is the simple average of the members' outputs. An improvement can be expected, however, if the individual members are weighted according to their performance during the training stage. This paper presents an architecture aimed at that target: a new training algorithm, the combination weights adaptive modulation (CWAM) algorithm, which uses a feedback mechanism to iteratively improve the classification capability of the ensemble. Under CWAM, all individual networks in the ensemble are trained simultaneously and interactively through the evolving combination weights. CWAM was evaluated on five well-known classification problems; the experiments show that its generalization ability is markedly better than that of other neural network ensembles in almost all cases.
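The abstract gives no pseudocode for CWAM. As a rough illustration of the weighted-combination idea it describes, the following is a minimal Python/NumPy sketch: each member takes one gradient step per epoch, and a feedback step then re-modulates the combination weights from the members' current training errors before renormalizing. The inverse-error update rule, the single-logistic-unit members, and the toy data are all assumptions made for illustration, not the paper's actual CWAM update, and the sketch does not reproduce the coupling by which the evolving weights in turn influence member training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data (a hypothetical stand-in for the benchmark problems).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

M, epochs, lr = 5, 50, 0.1

# Each ensemble member is reduced to a single logistic unit to keep the
# sketch short; the paper uses full neural networks.
W = [rng.normal(scale=0.1, size=X.shape[1]) for _ in range(M)]
b = [0.0] * M
alpha = np.full(M, 1.0 / M)  # combination weights, initially uniform
# Bootstrap samples give the members some diversity.
boot = [rng.integers(0, len(X), size=len(X)) for _ in range(M)]

def predict(i, Xq):
    """Sigmoid output of member i on inputs Xq."""
    return 1.0 / (1.0 + np.exp(-(Xq @ W[i] + b[i])))

for _ in range(epochs):
    # All members take one gradient step per epoch (simultaneous training).
    for i in range(M):
        Xi, yi = X[boot[i]], y[boot[i]]
        p = predict(i, Xi)
        W[i] = W[i] - lr * Xi.T @ (p - yi) / len(yi)
        b[i] = b[i] - lr * np.mean(p - yi)

    # Feedback step: modulate each combination weight from the member's
    # current training error, then renormalize. This inverse-error rule
    # is an assumption; the paper's exact CWAM update is not shown here.
    err = np.array([np.mean((predict(i, X) > 0.5) != y) for i in range(M)])
    alpha = 1.0 / (err + 1e-3)
    alpha /= alpha.sum()

# Weighted ensemble output.
p_ens = sum(a * predict(i, X) for i, a in enumerate(alpha))
print("combination weights:", np.round(alpha, 3))
print("ensemble training accuracy:", np.mean((p_ens > 0.5) == y))
```

Because the weight update runs inside the training loop rather than after it, poorly performing members lose influence as training progresses, which is the feedback behavior the abstract attributes to CWAM.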
