A Novel Dynamic Weight Neural Network Ensemble Model

Neural networks are prone to local minima and overfitting in practical applications. This paper proposes a novel dynamic weight neural network ensemble model (DW-NNE). The Bagging algorithm generates a pool of neural network individuals, from which a subset is selected by K-means clustering. Because the value of K cannot be chosen automatically when K-means is used for individual selection, a K-value optimization algorithm based on a distance cost function is put forward to find the optimal K. In addition, for combining the outputs of the selected individuals, the paper proposes a dynamic weight model based on a fuzzy neural network, following the idea of dynamic weighting. Experimental results show that the proposed ensemble achieves better prediction accuracy than both a traditional single model and conventional neural network ensemble models.
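
The pipeline described above can be sketched end to end. A minimal sketch follows, with several assumptions: the paper's exact distance cost function and its fuzzy-neural-network weight generator are not given here, so `distance_cost` (within-cluster scatter plus a linear penalty on K, with an arbitrary trade-off weight) and the softmax weighting over local validation error are illustrative stand-ins, and scikit-learn's `MLPRegressor` and `KMeans` are my choices of components, not the authors'.

```python
# Minimal sketch of the DW-NNE pipeline, under the assumptions stated above:
# the distance cost and the dynamic-weight rule are stand-ins, not the
# paper's formulas.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy regression data split into train / validation sets.
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(400)
X_tr, y_tr, X_val, y_val = X[:300], y[:300], X[300:], y[300:]

# 1) Bagging: train individuals on bootstrap resamples of the training set.
n_nets = 20
nets = []
for _ in range(n_nets):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)
    nets.append(net.fit(X_tr[idx], y_tr[idx]))

# Represent each individual by its validation-set output vector, so that
# clustering groups behaviorally similar networks.
outputs = np.array([net.predict(X_val) for net in nets])   # (n_nets, n_val)

# 2) Choose K by minimizing a distance cost: average within-cluster scatter
#    plus a penalty that discourages large K (an assumed form).
def distance_cost(k, Z):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(Z)
    return km.inertia_ / len(Z) + 0.05 * k   # 0.05 is an arbitrary trade-off

costs = {k: distance_cost(k, outputs) for k in range(2, 8)}
best_k = min(costs, key=costs.get)

# 3) Selection: keep one representative per cluster, the member closest
#    to its cluster centroid.
km = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit(outputs)
selected = []
for c in range(best_k):
    members = np.flatnonzero(km.labels_ == c)
    dists = np.linalg.norm(outputs[members] - km.cluster_centers_[c], axis=1)
    selected.append(nets[members[np.argmin(dists)]])

# 4) Dynamic weights per query point: a softmax over each selected network's
#    validation error near the query, standing in for the paper's
#    fuzzy-neural-network weight generator.
def dw_predict(x):
    x = np.atleast_2d(x)
    near = np.argsort(np.linalg.norm(X_val - x, axis=1))[:20]
    errs = np.array([np.mean((m.predict(X_val[near]) - y_val[near]) ** 2)
                     for m in selected])
    w = np.exp(-errs / errs.mean())
    w /= w.sum()
    return sum(wi * m.predict(x)[0] for wi, m in zip(w, selected))

print(f"K* = {best_k}, DW-NNE prediction at x=1.0: {dw_predict([1.0]):.3f}")
```

Clustering the networks by their output vectors rather than their parameters is what lets the selection step prune redundant individuals while preserving diversity: two networks with different weights but near-identical predictions fall into the same cluster and only one is kept.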
