A complexity-based Silent Pruning Algorithm

This paper presents a Lempel-Ziv complexity (LZC) based pruning algorithm, the Silent Pruning Algorithm (SPA), for designing artificial neural networks (ANNs). The algorithm prunes hidden neurons during training according to their ranks computed with LZC. LZC counts the number of distinct patterns in a time sequence and uses that count as a measure of rank; it is therefore expected to capture how idle or active a hidden unit is during training. Pruning removes the hidden nodes of a three-layer feed-forward network that have the lowest LZC (i.e., the lowest rank). The algorithm shares its goal with existing pruning algorithms but differs in its selection criterion. A secondary goal of the method is to quantify the complexity and topology of an ANN; thus SPA not only prunes hidden units but also helps maintain network complexity. SPA has been tested on a number of challenging benchmark problems in machine learning and ANNs, including the Cancer, Diabetes, Heart disease, Card, and Glass identification problems. To assess the effectiveness of SPA, we also developed a baseline method that prunes hidden units at random, and we further compared SPA with a pruning method based on variance analysis of sensitivity information. The experimental results show that SPA can design compact ANN architectures with good generalization ability. They also show that facilitating complexity and suppressing simplicity help training produce resourceful and functional networks.
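The abstract gives no pseudocode, so the following is a minimal sketch of the two ingredients it describes: the LZ76 phrase-counting measure of Lempel and Ziv (here in the common Kaspar-Schuster formulation) applied to each hidden unit's binarized activation trace, and a ranking of units from lowest to highest complexity so that the lowest-ranked (most idle) unit is pruned first. The (T, H) activation matrix, the median binarization threshold, and the rank_hidden_units helper are illustrative assumptions, not the paper's exact procedure.

    import numpy as np

    def lempel_ziv_complexity(s):
        """LZ76 complexity of a binary string: the number of distinct
        phrases found while scanning s (Kaspar-Schuster formulation)."""
        n = len(s)
        if n < 2:
            return n
        i, k, l = 0, 1, 1    # i: candidate match start, k: match length, l: current phrase start
        k_max, c = 1, 1      # c counts phrases; the first symbol is phrase 1
        while True:
            if s[i + k - 1] == s[l + k - 1]:
                k += 1
                if l + k > n:        # ran off the end while still matching
                    c += 1
                    break
            else:
                k_max = max(k_max, k)
                i += 1
                if i == l:           # no earlier substring reproduces s[l:]: new phrase
                    c += 1
                    l += k_max
                    if l + 1 > n:
                        break
                    i, k, k_max = 0, 1, 1
                else:
                    k = 1
        return c

    def rank_hidden_units(activations):
        """Rank hidden units by the LZC of their activation traces.

        activations: (T, H) array, unit h's output recorded at T points
        during training. Each trace is binarized at its median (an
        assumed threshold) before scoring. Returns unit indices ordered
        from lowest LZC (pruned first under SPA) to highest.
        """
        scores = []
        for trace in activations.T:
            thr = np.median(trace)
            bits = ''.join('1' if v > thr else '0' for v in trace)
            scores.append(lempel_ziv_complexity(bits))
        return np.argsort(scores, kind='stable')

As a sanity check on the idleness reading: a unit whose trace is nearly constant binarizes to a string like '0000...', which has LZC 2 regardless of length, while a unit whose output keeps changing accumulates many distinct phrases and scores higher, so sorting ascending puts the most idle units first.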

[1] Yoshio Hirose, et al. Backpropagation algorithm which varies the number of hidden units, 1989, International 1989 Joint Conference on Neural Networks.

[2] Lutz Prechelt, et al. PROBEN 1 - a set of benchmarks and benchmarking rules for neural network training algorithms, 1994.

[3] Md. Monirul Islam, et al. An algorithm for automatic design of two hidden layered artificial neural networks, 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000).

[4] Kazuyuki Murase, et al. A New Algorithm to Design Multiple Hidden Layered Artificial Neural Networks, 2006.

[5] Xin Yao, et al. A Preliminary Study on Designing Artificial Neural Networks Using Co-evolution, 1995.

[6] Andries Petrus Engelbrecht, et al. A new pruning heuristic based on variance analysis of sensitivity information, 2001, IEEE Trans. Neural Networks.

[7] M. Mandischer. Evolving recurrent neural networks with non-binary encoding, 1995, Proceedings of 1995 IEEE International Conference on Evolutionary Computation.

[8] G. Edelman, et al. A measure for brain complexity: relating functional segregation and integration in the nervous system, 1994, Proceedings of the National Academy of Sciences of the United States of America.

[9] Rudy Setiono, et al. Use of a quasi-Newton method in a feedforward neural network construction algorithm, 1995, IEEE Trans. Neural Networks.

[10] Abraham Lempel, et al. On the Complexity of Finite Sequences, 1976, IEEE Trans. Inf. Theory.

[11] Jean-Pierre Nadal, et al. Study of a Growth Algorithm for a Feedforward Network, 1989, Int. J. Neural Syst.

[12] James T. Kwok, et al. Constructive algorithms for structure learning in feedforward neural networks for regression problems, 1997, IEEE Trans. Neural Networks.

[13] Mike Wynne-Jones, et al. Node splitting: A constructive algorithm for feed-forward neural networks, 1991, Neural Computing & Applications.

[14] Kazuyuki Murase, et al. A new algorithm to design compact two-hidden-layer artificial neural networks, 2001, Neural Networks.

[15] F. Heimes, et al. Traditional and evolved dynamic neural networks for aircraft simulation, 1997, 1997 IEEE International Conference on Systems, Man, and Cybernetics.

[16] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.

[17] Alfred Jean Philippe Lauret, et al. A node pruning algorithm based on a Fourier amplitude sensitivity test method, 2006, IEEE Transactions on Neural Networks.

[18] Neil Burgess, et al. A Constructive Algorithm that Converges for Real-Valued Input Patterns, 1994, Int. J. Neural Syst.

[19] Timur Ash, et al. Dynamic node creation in backpropagation networks, 1989.

[20] Xin Yao, et al. A New Adaptive Merging and Growing Algorithm for Designing Artificial Neural Networks, 2009, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).

[21] Lutz Prechelt, et al. Automatic early stopping using cross validation: quantifying the criteria, 1998, Neural Networks.

[22] Russell Reed, et al. Pruning algorithms - a survey, 1993, IEEE Trans. Neural Networks.

[23] Steven Young, et al. CARVE - A Constructive Algorithm for Real Valued Examples, 1994.

[24] Andries Petrus Engelbrecht, et al. Variance analysis of sensitivity information for pruning multilayer feedforward neural networks, 1999, IJCNN'99 International Joint Conference on Neural Networks.

[25] Barry P. Haynes, et al. Pruning Artificial Neural Networks Using Neural Complexity Measures, 2008, Int. J. Neural Syst.

[26] L. Darrell Whitley, et al. Genetic algorithms and neural networks: optimizing connections and connectivity, 1990, Parallel Comput.

[27] Jan Torreele, et al. Temporal Processing with Recurrent Networks: An Evolutionary Approach, 1991, ICGA.

[28] Darrell Whitley, et al. Optimizing small neural networks using a distributed genetic algorithm, 1990.

[29] Léon Personnaz, et al. Neural-network construction and selection in nonlinear modeling, 2003, IEEE Trans. Neural Networks.