Orthogonal least squares learning algorithm for radial basis function networks

The radial basis function network offers a viable alternative to the two-layer neural network in many applications of signal processing. A common learning algorithm for radial basis function networks is based on first choosing some data points at random as radial basis function centers and then using singular-value decomposition to solve for the weights of the network. Such a procedure has several drawbacks; in particular, an arbitrary selection of centers is clearly unsatisfactory. The authors propose an alternative learning procedure based on the orthogonal least-squares method. The procedure chooses radial basis function centers one by one in a rational way until an adequate network has been constructed. In the algorithm, each selected center maximizes the increment to the explained variance or energy of the desired output, and the algorithm does not suffer from numerical ill-conditioning problems. The orthogonal least-squares learning strategy provides a simple and efficient means for fitting radial basis function networks. This is illustrated using examples taken from two different signal processing applications.
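The center-selection procedure described in the abstract can be sketched as a forward selection over candidate regressors: each candidate basis-function column is orthogonalized against the columns already chosen, and the candidate with the largest error-reduction ratio (the increment to the explained energy of the desired output) is added, until the unexplained energy falls below a tolerance. The following is a minimal illustrative sketch, not the authors' exact implementation; the Gaussian basis function, its width, and the synthetic data are assumptions for demonstration, and a simplified classical Gram-Schmidt orthogonalization is used.

```python
import numpy as np

def gaussian_design(X, centers, width):
    """Candidate regressor matrix: one Gaussian basis column per center.

    Assumes a Gaussian basis function; the paper's framework allows other
    radial functions as well.
    """
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def ols_select_centers(P, d, tol=0.01, max_centers=None):
    """Forward selection of regressors by orthogonal least squares.

    At each step, every remaining candidate column of P is orthogonalized
    against the already-selected orthogonal columns, and the candidate with
    the largest error-reduction ratio
        err_k = (w_k . d)^2 / ((w_k . w_k) * (d . d))
    is added.  Selection stops when the unexplained fraction of the output
    energy drops below `tol`, i.e. when an adequate network is reached.
    """
    N, M = P.shape
    if max_centers is None:
        max_centers = M
    d_energy = d @ d
    selected, W = [], []            # chosen column indices, orthogonal columns
    remaining = list(range(M))
    err_total = 0.0
    while remaining and len(selected) < max_centers:
        best_err, best_k, best_w = -1.0, None, None
        for k in remaining:
            w = P[:, k].astype(float)
            for wj in W:            # Gram-Schmidt against selected columns
                w = w - (wj @ P[:, k]) / (wj @ wj) * wj
            ww = w @ w
            if ww < 1e-12:          # numerically dependent candidate; skip
                continue
            err = (w @ d) ** 2 / (ww * d_energy)
            if err > best_err:
                best_err, best_k, best_w = err, k, w
        if best_k is None:
            break
        selected.append(best_k)
        W.append(best_w)
        remaining.remove(best_k)
        err_total += best_err
        if 1.0 - err_total < tol:   # adequate network constructed
            break
    return selected, err_total
```

Because each candidate is scored after orthogonalization against the columns already in the model, the error-reduction ratios of the selected centers sum directly toward the total explained energy, which is what makes the one-by-one selection both rational and numerically well-behaved compared with an arbitrary random choice of centers.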
