Multiobjective Optimization of Ensembles of Multilayer Perceptrons for Pattern Classification

Pattern classification seeks to minimize the error made on unknown patterns. In many real-world applications, however, type I (false positive) and type II (false negative) errors must be handled separately, which is a difficult problem because an attempt to minimize one of them usually makes the other grow. Moreover, one type of error can be more important than the other, so a trade-off that minimizes the most important error type must be reached. Despite the importance of type II errors, most pattern classification methods take into account only the global classification error. In this paper we propose to optimize both error types by means of a multiobjective algorithm in which each error type and the network size are objectives of the fitness function. A modified version of the G-Prop method (design and optimization of multilayer perceptrons) is used to simultaneously optimize the network size and the type I and type II errors. A minimal illustrative sketch of how these three objectives can be evaluated and compared is given below.
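The following sketch is an illustration only, not the authors' G-Prop implementation: it shows how a candidate multilayer perceptron could be scored on the three objectives described above (type I error, type II error, and network size) and how two candidates might be compared by Pareto dominance. The use of scikit-learn's MLPClassifier, the hidden-layer sizes, and the dataset are assumptions chosen for the example.

```python
# Minimal sketch (not the authors' G-Prop code): evaluate a candidate MLP on
# three objectives -- type I error, type II error, network size -- and compare
# candidates by Pareto dominance (all objectives are minimized).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier


def objectives(model, X_val, y_val):
    """Return (type I error, type II error, hidden units) for a fitted MLP."""
    y_pred = model.predict(X_val)
    negatives = y_val == 0
    positives = y_val == 1
    # Type I: fraction of true negatives classified as positive (false positives).
    type_1 = float(np.mean(y_pred[negatives] == 1)) if negatives.any() else 0.0
    # Type II: fraction of true positives classified as negative (false negatives).
    type_2 = float(np.mean(y_pred[positives] == 0)) if positives.any() else 0.0
    # Network size proxy: total number of hidden units (an assumption for this sketch).
    size = sum(model.hidden_layer_sizes)
    return np.array([type_1, type_2, size])


def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    return bool(np.all(a <= b) and np.any(a < b))


if __name__ == "__main__":
    # Hypothetical imbalanced two-class problem, stand-in for a real benchmark.
    X, y = make_classification(n_samples=600, weights=[0.7, 0.3], random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    # Two hypothetical candidates differing only in hidden-layer size.
    small = MLPClassifier(hidden_layer_sizes=(5,), max_iter=500, random_state=0).fit(X_tr, y_tr)
    large = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0).fit(X_tr, y_tr)

    f_small = objectives(small, X_val, y_val)
    f_large = objectives(large, X_val, y_val)
    print("small:", f_small, "large:", f_large)
    print("small dominates large:", dominates(f_small, f_large))
```

In a multiobjective evolutionary loop such a dominance test would drive selection, so that networks trading off the two error types against size survive along the Pareto front rather than a single scalar-error winner.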
