Radial basis function (RBF) networks are widely applied to regression problems. However, several issues arise in such applications, among them the choice of the number of hidden nodes and the settings of the basis functions. In this paper, we propose a novel RBF network approach for regression estimation. A self-constructing clustering algorithm is used to determine the number of nodes in the hidden layer. Normalized Gaussian functions are taken as basis functions, and their centers and deviations are set according to the data distribution of the formed clusters. To learn optimal values of the parameters associated with the network, the least squares method and steepest descent backpropagation are adopted. With the proposed approach, the number of hidden nodes is determined automatically, and the initial settings of the basis functions are obtained. Furthermore, through the incorporation of adaptive deviations, the training data can be represented more suitably than with standard basis functions. Experimental results show that the proposed approach is effective.
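As a rough illustration of the kind of model the abstract describes, the sketch below builds a normalized-Gaussian RBF regressor whose centers and deviations come from a simple clustering step and whose output weights are fit by linear least squares. It is not the authors' algorithm: plain k-means stands in for the self-constructing clustering, fixed per-cluster deviations replace the adaptive-deviation learning, and all function names and parameters are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a normalized-Gaussian RBF regressor.
# NOTE: k-means stands in for the paper's self-constructing clustering,
# and fixed per-cluster deviations replace its adaptive deviations;
# this is an illustrative assumption, not the authors' method.

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def _design_matrix(X, centers, sigmas):
    # Gaussian responses, then row-normalize so each row sums to one
    # (normalized Gaussian basis functions).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    G = np.exp(-d2 / (2.0 * sigmas ** 2))
    return G / G.sum(axis=1, keepdims=True)

def fit_rbf(X, y, k=5):
    centers, labels = kmeans(X, k)
    # Per-cluster deviation: spread of the points assigned to each cluster,
    # with a fallback to the global spread for empty clusters.
    sigmas = np.array([
        X[labels == j].std() if np.any(labels == j) else X.std()
        for j in range(k)
    ]) + 1e-6
    Phi = _design_matrix(X, centers, sigmas)
    # Output weights by linear least squares.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, sigmas, w

def predict(X, centers, sigmas, w):
    return _design_matrix(X, centers, sigmas) @ w

# Usage: fit a 1-D toy regression problem.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(1).standard_normal(200)
centers, sigmas, w = fit_rbf(X, y, k=8)
y_hat = predict(X, centers, sigmas, w)
print("training MSE:", np.mean((y - y_hat) ** 2))
```

In the full approach, the least-squares weights and the basis-function deviations would be further refined by steepest descent backpropagation, which this sketch omits.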