Pareto Neuro-Ensembles

The formation of a neural network ensemble has attracted much attention in the machine learning literature. A set of trained neural networks is combined using a post-gate to form a single super-network. One main challenge in the literature is deciding which networks to include in, or exclude from, the ensemble. Another challenge is how to define an optimal size for the ensemble. Some researchers also claim that for an ensemble to be effective, its member networks need to be different. However, there is no consistent definition of what “different” means. Some take it to mean weakly correlated networks, networks with different bias-variance trade-offs, and/or networks that specialize on different parts of the input space.
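The combination scheme described above (several trained networks merged by a post-gate, with diversity often judged by how weakly correlated the members are) can be illustrated with a minimal sketch. The names ToyMember, gate_average, and pairwise_correlation below are hypothetical: the members are stand-in linear classifiers rather than trained neural networks, and the gate shown is plain probability averaging, not the paper's actual combination rule.

```python
import numpy as np

# Hypothetical stand-in for a trained member network: a fixed random
# linear classifier producing class probabilities via a softmax.
# In the paper's setting each member would be a trained multilayer
# perceptron; here we only need objects that map inputs to probabilities.
class ToyMember:
    def __init__(self, n_inputs, n_classes, rng):
        self.W = rng.normal(size=(n_inputs, n_classes))

    def predict_proba(self, X):
        scores = X @ self.W
        e = np.exp(scores - scores.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)


def gate_average(members, X):
    """Simplest possible post-gate: average the members' class probabilities
    and predict the class with the highest averaged probability."""
    probs = np.mean([m.predict_proba(X) for m in members], axis=0)
    return probs.argmax(axis=1)


def pairwise_correlation(members, X):
    """Mean pairwise correlation of the members' class-0 probabilities,
    one rough way to quantify how 'different' the members are
    (weakly correlated members are often preferred)."""
    outputs = np.stack([m.predict_proba(X)[:, 0] for m in members])
    corr = np.corrcoef(outputs)
    n = len(members)
    off_diag = corr[~np.eye(n, dtype=bool)]
    return off_diag.mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))                 # toy inputs
    members = [ToyMember(10, 2, rng) for _ in range(5)]

    y_hat = gate_average(members, X)
    print("ensemble predictions:", y_hat[:10])
    print("mean pairwise correlation:", pairwise_correlation(members, X))
```

Averaging is only the most basic gate; a trained gate, or a selection of members driven by a multi-objective (Pareto) criterion as studied in this line of work, would take the place of gate_average.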
