A novel efficient two-phase algorithm for training interpolation radial basis function networks

Interpolation radial basis function (RBF) networks are widely used in many applications. Their output-layer weights are usually determined either by minimizing the sum-of-squares error or by solving the interpolation equations directly. When the number of interpolation nodes is large, these methods are time consuming, make it difficult to balance convergence rate against generality, and struggle to reach high accuracy. In this paper, we propose a two-phase algorithm for training interpolation RBF networks with bell-shaped basis functions. In the first phase, the width parameter of each basis function is determined by taking into account the tradeoff between the error and the convergence rate. In the second phase, the output-layer weights are obtained as the fixed point of a contraction mapping. The running time of the new algorithm is short, the balance between convergence rate and generality is easily controlled by adjusting the algorithm's parameters, and the error can be made as small as desired. Moreover, the algorithm is readily parallelizable, which can further reduce its running time. Its efficiency is illustrated by simulations.
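To make the two-phase idea concrete, the following is a minimal sketch, not the authors' exact procedure: it assumes Gaussian ("bell-shaped") basis functions, a width-shrinking rule controlled by hypothetical parameters q and alpha that keeps the off-diagonal mass of each column of the interpolation matrix below q < 1 (so that Psi = Phi - I is a contraction), and a fixed-point iteration w <- y - Psi w for the output-layer weights.

# Illustrative sketch of a two-phase training scheme for a Gaussian
# interpolation RBF network; q, alpha and the shrinking rule are assumptions.
import numpy as np

def train_interpolation_rbf(X, y, q=0.8, alpha=0.9, tol=1e-10, max_iter=10_000):
    """X: (N, d) interpolation nodes; y: (N,) target values.

    Phase 1: shrink each width sigma_k until the off-diagonal mass of the
    k-th column of the interpolation matrix Phi is at most q < 1, so that
    Psi = Phi - I has ||Psi||_1 <= q and w -> y - Psi @ w is a contraction.
    Phase 2: obtain the output-layer weights as the fixed point of that map,
    which solves the interpolation equations Phi @ w = y.
    """
    N = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)   # squared node distances

    # Phase 1: choose the width of each basis function.
    sigma = np.full(N, np.sqrt(d2[d2 > 0].mean()))               # initial widths
    for k in range(N):
        while True:
            col = np.exp(-d2[:, k] / sigma[k] ** 2)
            if col.sum() - 1.0 <= q:                             # off-diagonal column mass
                break
            sigma[k] *= alpha                                    # shrink the width

    Phi = np.exp(-d2 / sigma[None, :] ** 2)                      # Phi[i, k] = phi_k(x_i)
    Psi = Phi - np.eye(N)

    # Phase 2: fixed-point iteration for the output-layer weights.
    w = y.astype(float).copy()
    for _ in range(max_iter):
        w_next = y - Psi @ w
        if np.max(np.abs(w_next - w)) < tol:
            w = w_next
            break
        w = w_next
    return sigma, w

# Usage: interpolate a 1-D function on scattered nodes and check the residual.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(30, 1))
    y = np.sin(2 * np.pi * X[:, 0])
    sigma, w = train_interpolation_rbf(X, y)
    Phi = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, -1) / sigma[None, :] ** 2)
    print("max interpolation error:", np.max(np.abs(Phi @ w - y)))

Because each width sigma_k and each component of the iteration can be computed independently, both phases parallelize naturally across the interpolation nodes, which is the source of the running-time improvement claimed above.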
