STATISTICAL CONTROL OF GROWING AND PRUNING IN RBF-LIKE NEURAL NETWORKS

Abstract: Incremental Net Pro (IncNet Pro), a network with local learning and statistically controlled growing and pruning, is introduced. The architecture of the net is based on RBF networks. The Extended Kalman Filter algorithm, together with a new fast version of it, is proposed and used as the learning algorithm. IncNet Pro is similar in its main growing idea to the Resource-Allocating Network described by Platt, but a novel statistical criterion is used to determine when to grow the network. IncNet Pro uses a pruning method similar to Optimal Brain Surgeon by Hassibi, but based on the Extended Kalman Filter algorithm. Bi-radial functions are used instead of radial basis functions to obtain a more flexible network.
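The bi-radial functions mentioned above replace each spherical Gaussian unit with a product of sigmoid-pair "windows", one per input dimension, so a single unit can cover a soft hyper-rectangle with per-dimension centre, width, and slope. A minimal sketch of this idea follows; the exact parameterisation (how `b` and `s` enter the sigmoids) is an illustrative assumption, not the paper's definitive formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def biradial(x, t, b, s):
    """Sketch of a bi-radial basis function (after Duch's bi-radial
    transfer functions): for each dimension i, a rising sigmoid at
    t_i - b_i and a falling sigmoid at t_i + b_i form a soft window
    of width ~2*b_i and slope s_i; the unit output is their product
    over all dimensions. Parameter conventions are assumptions here.
    """
    x, t, b, s = (np.asarray(a, dtype=float) for a in (x, t, b, s))
    window = sigmoid(s * (x - t + b)) * (1.0 - sigmoid(s * (x - t - b)))
    return float(np.prod(window))
```

For example, with slopes large relative to the widths, the unit responds with a value near 1 inside the box `t ± b` and decays toward 0 outside it, whereas an RBF unit is constrained to a single radial width in all directions.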

[1] Christian Lebiere, et al. The Cascade-Correlation Learning Architecture, 1989, NIPS.

[2] Wlodzislaw Duch, et al. Feature space mapping as a universal adaptive system, 1995.

[3] Yann LeCun, et al. Optimal Brain Damage, 1989, NIPS.

[4] M. J. D. Powell, et al. Radial basis functions for multivariable interpolation: a review, 1987.

[5] Yoshiki Uchikawa, et al. On fuzzy modeling using fuzzy neural networks with the back-propagation algorithm, 1992, IEEE Trans. Neural Networks.

[6] Tadashi Kondo, et al. Revised GMDH Algorithm Estimating Degree of the Complete Polynomial, 1986.

[7] Visakan Kadirkamanathan. A statistical inference based growth criterion for the RBF network, 1994, Proceedings of IEEE Workshop on Neural Networks for Signal Processing.

[8] Visakan Kadirkamanathan, et al. A Function Estimation Approach to Sequential Learning with Neural Networks, 1993, Neural Computation.

[9] Chris Bishop, et al. Improving the Generalization Properties of Radial Basis Function Neural Networks, 1991, Neural Computation.

[10] Benjamin W. Wah, et al. Global Optimization for Neural Network Training, 1996, Computer.

[11] M. Sugeno, et al. Structure identification of fuzzy model, 1988.

[12] F. Girosi, et al. Networks for approximation and learning, 1990, Proc. IEEE.

[13] D. Lowe, et al. Adaptive radial basis function nonlinearities, and the problem of generalisation, 1989.

[14] David E. Rumelhart, et al. Back-Propagation, Weight-Elimination and Time Series Prediction, 1991.

[15] F. Girosi, et al. From regularization to radial, tensor and additive splines, 1993, Neural Networks for Signal Processing III - Proceedings of the 1993 IEEE-SP Workshop.

[16] E. Fiesler, et al. Comparative Bibliography of Ontogenic Neural Networks, 1994.

[17] Y. Lu, et al. A Sequential Learning Scheme for Function Approximation Using Minimal Radial Basis Function Neural Networks, 1997, Neural Computation.

[18] Babak Hassibi, et al. Second Order Derivatives for Network Pruning: Optimal Brain Surgeon, 1992, NIPS.

[19] John C. Platt. A Resource-Allocating Network for Function Interpolation, 1991, Neural Computation.