Pruning Neural Networks with Distribution Estimation Algorithms

This paper describes the application of four evolutionary algorithms to the pruning of neural networks used in classification problems. In addition to a simple genetic algorithm (GA), the paper considers three distribution estimation algorithms (DEAs): the compact GA, the extended compact GA, and the Bayesian Optimization Algorithm. The objective is to determine whether the DEAs offer advantages over the simple GA in accuracy or speed on this problem. The experiments used a feedforward neural network trained with standard backpropagation and 15 public-domain and artificial data sets. In most cases, the pruned networks matched or exceeded the accuracy of the original fully connected networks. We found few differences in the accuracy of the networks pruned by the four EAs, but large differences in execution time. The results suggest that a simple GA with a small population may be the best algorithm for pruning networks on the data sets we tested.
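To make the setup concrete, the sketch below shows one way a simple GA can prune an already-trained feedforward classifier: each individual is a binary mask over the network's connections, and fitness is the validation accuracy of the masked network. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names, network details, and GA parameters are all hypothetical.

```python
# Minimal sketch (not the paper's implementation): prune a trained
# fully connected network with a simple GA over binary connection masks.
import numpy as np

rng = np.random.default_rng(0)

def forward(x, weights, masks):
    """Forward pass with pruned (masked) weight matrices."""
    h = x
    for i, (w, m) in enumerate(zip(weights, masks)):
        h = h @ (w * m)
        if i < len(weights) - 1:              # sigmoid on hidden layers (assumed)
            h = 1.0 / (1.0 + np.exp(-h))
    return h

def accuracy(weights, masks, x_val, y_val):
    """Fitness: classification accuracy of the masked network on validation data."""
    preds = forward(x_val, weights, masks).argmax(axis=1)
    return (preds == y_val).mean()

def decode(bits, shapes):
    """Split a flat bit string into one 0/1 mask per weight matrix."""
    masks, k = [], 0
    for shape in shapes:
        n = shape[0] * shape[1]
        masks.append(bits[k:k + n].reshape(shape))
        k += n
    return masks

def simple_ga_prune(weights, x_val, y_val, pop_size=20, generations=50, p_mut=0.01):
    shapes = [w.shape for w in weights]
    n_bits = sum(s[0] * s[1] for s in shapes)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))     # random initial masks
    for _ in range(generations):
        fit = np.array([accuracy(weights, decode(ind, shapes), x_val, y_val)
                        for ind in pop])
        # binary tournament selection
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fit[a] > fit[b], a, b)]
        # uniform crossover with a shuffled partner, then bit-flip mutation
        partners = parents[rng.permutation(pop_size)]
        cross = rng.integers(0, 2, (pop_size, n_bits)).astype(bool)
        children = np.where(cross, parents, partners)
        flip = rng.random((pop_size, n_bits)) < p_mut
        pop = np.where(flip, 1 - children, children)
    fit = np.array([accuracy(weights, decode(ind, shapes), x_val, y_val)
                    for ind in pop])
    return decode(pop[fit.argmax()], shapes)               # best mask found
```

A DEA such as the compact GA, extended compact GA, or BOA would replace the crossover and mutation step with sampling new bit strings from a probability model estimated from the selected individuals, which is the difference the paper's comparison is designed to evaluate.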
