A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation

This work presents a new sequential learning algorithm for radial basis function (RBF) networks, referred to as the generalized growing and pruning algorithm for RBF networks (GGAP-RBF). The paper first introduces the concept of significance for the hidden neurons and then uses it in the learning algorithm to realize parsimonious networks. The growing and pruning strategy of GGAP-RBF links the required learning accuracy to the significance of the nearest existing neuron or of an intentionally added new neuron. The significance of a neuron is a measure of its average information content. The GGAP-RBF algorithm is derived from a rigorous statistical viewpoint and can be used with training samples drawn from an arbitrary sampling density. Simulation results on benchmark function-approximation problems show that GGAP-RBF outperforms several other sequential learning algorithms in learning speed, network size, and generalization performance, regardless of the sampling density function of the training data.
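To make the growing-and-pruning idea concrete, the sketch below shows a sequential RBF learner that gates both neuron addition and neuron removal on a significance estimate. It is only an illustration of the concept described in the abstract, not the paper's algorithm: the class name `GGAPRBFSketch`, the simplified significance proxy (average absolute contribution of a Gaussian unit over a uniform input domain), the 0.7 width factor, and the plain gradient weight update are all assumptions made for the example; the paper itself derives an exact significance expression and uses a more elaborate parameter-update scheme.

```python
import numpy as np


class GGAPRBFSketch:
    """Sketch of a significance-gated growing/pruning RBF learner.

    The significance measure and the parameter update are simplified
    stand-ins for the quantities derived in the GGAP-RBF paper.
    """

    def __init__(self, input_range, e_min=0.01, eps_grow=0.5, lr=0.05):
        self.centers = []               # list of (d,) center vectors
        self.widths = []                # list of Gaussian widths
        self.weights = []               # list of output weights
        self.input_range = input_range  # linear size of the input domain
        self.e_min = e_min              # required learning accuracy
        self.eps_grow = eps_grow        # minimum distance before growing
        self.lr = lr                    # learning rate for weight updates

    def predict(self, x):
        y = 0.0
        for c, s, w in zip(self.centers, self.widths, self.weights):
            y += w * np.exp(-np.sum((x - c) ** 2) / s ** 2)
        return y

    def _significance(self, w, s, d):
        # Simplified proxy: average |contribution| of a Gaussian unit over a
        # uniform sampling density on a domain of linear size input_range.
        return abs(w) * (np.sqrt(np.pi) * s / self.input_range) ** d

    def learn_one(self, x, y):
        x = np.asarray(x, dtype=float)
        d = x.size
        err = y - self.predict(x)

        if self.centers:
            dists = [np.linalg.norm(x - c) for c in self.centers]
            nearest = int(np.argmin(dists))
            dist_nr = dists[nearest]
        else:
            nearest, dist_nr = None, np.inf

        # Growing: add a neuron only if it is far from existing centers and
        # the candidate neuron would itself be significant enough.
        new_width = 0.7 * dist_nr if np.isfinite(dist_nr) else 1.0
        if dist_nr > self.eps_grow and self._significance(err, new_width, d) > self.e_min:
            self.centers.append(x.copy())
            self.widths.append(new_width)
            self.weights.append(err)
            return

        if nearest is not None:
            # Otherwise adjust the nearest neuron (plain gradient step here).
            c, s = self.centers[nearest], self.widths[nearest]
            phi = np.exp(-np.sum((x - c) ** 2) / s ** 2)
            self.weights[nearest] += self.lr * err * phi

            # Pruning: drop the nearest neuron once it is no longer significant.
            if self._significance(self.weights[nearest], s, d) < self.e_min:
                for lst in (self.centers, self.widths, self.weights):
                    lst.pop(nearest)
```

Feeding samples one at a time through `learn_one` grows the network where new, significant information arrives and prunes units whose average contribution has fallen below the accuracy threshold, which is the mechanism the abstract credits for the parsimonious network size.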
