GA with orthogonal transformation for RBFN configuration

A genetic algorithm (GA) approach to radial basis function network (RBFN) training is proposed. Although several non-evolutionary training methods for RBFNs are available, the orthogonal least squares (OLS) algorithm has a number of properties that make it desirable for robust training. In the OLS algorithm the basis function centers are selected from among the training samples, so the candidate centers are predetermined in their totality. The width parameters of the basis functions, by contrast, are not and in general cannot be accurately predetermined, and are therefore usually taken to be equal for all centers. In the GA approach, in addition to the optimally selected centers, the covariance matrix of the width parameters of those centers can also be determined, which enhances the network performance. Training an RBFN by a GA can be accomplished in different ways with different merits. The main contribution of this research is a novel method that optimally selects the subset from among the available candidate centers; for this purpose an orthogonal transformation method, namely the singular value decomposition (SVD)-QR method, is integrated into the algorithm. Along this line, the GA approach to RBFN training is discussed as an alternative to the OLS algorithm with additional desirable parameter-optimization properties. Experiments are described that demonstrate the competitiveness of the approach for robust RBFN training.
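
To make the two ingredients concrete, the following is a minimal sketch, not the paper's exact algorithm: (1) SVD-QR column subset selection picks RBF centers from among the training samples, and (2) a simple real-coded GA tunes one width per selected center, with the output weights fitted by linear least squares inside the fitness evaluation. The candidate pool, the GA operators, and all parameter settings are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr, svd, lstsq

def rbf_design_matrix(X, centers, widths):
    # Gaussian basis: phi_ij = exp(-||x_i - c_j||^2 / (2 * sigma_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def svd_qr_select(Phi, k):
    # Rank candidate columns (centers) by QR with column pivoting applied to
    # the k leading right singular vectors of the full design matrix.
    _, _, Vt = svd(Phi, full_matrices=False)
    _, _, piv = qr(Vt[:k, :], pivoting=True)
    return piv[:k]

def fitness(widths, X, y, centers):
    # Fit output-layer weights by least squares; score by training RMSE.
    Phi = rbf_design_matrix(X, centers, widths)
    w, *_ = lstsq(Phi, y)
    return np.sqrt(np.mean((Phi @ w - y) ** 2))

# Toy regression problem (illustrative data only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

k = 12                                    # number of hidden units (assumed)
nominal = np.full(X.shape[0], 0.8)        # shared width, used only for selection
centers = X[svd_qr_select(rbf_design_matrix(X, X, nominal), k)]

# Real-coded GA over per-center widths: tournament selection, blend crossover,
# Gaussian mutation, one elite. Settings are arbitrary illustrative choices.
pop = rng.uniform(0.2, 2.0, size=(30, k))
for gen in range(50):
    scores = np.array([fitness(ind, X, y, centers) for ind in pop])
    new_pop = [pop[scores.argmin()].copy()]              # elitism
    while len(new_pop) < len(pop):
        i, j = rng.integers(len(pop), size=2)
        a = pop[i] if scores[i] < scores[j] else pop[j]  # tournament pick
        i, j = rng.integers(len(pop), size=2)
        b = pop[i] if scores[i] < scores[j] else pop[j]
        alpha = rng.uniform(size=k)
        child = alpha * a + (1 - alpha) * b              # blend crossover
        child += rng.normal(0, 0.05, size=k) * (rng.uniform(size=k) < 0.2)
        new_pop.append(np.clip(child, 0.05, 5.0))
    pop = np.array(new_pop)

best = pop[np.array([fitness(ind, X, y, centers) for ind in pop]).argmin()]
print("GA-tuned per-center widths:", np.round(best, 3))
print("RMSE, GA-tuned widths :", fitness(best, X, y, centers))
print("RMSE, shared width 0.8:", fitness(np.full(k, 0.8), X, y, centers))
```

In the full approach the abstract describes, each center could carry a covariance matrix of width parameters rather than the single isotropic width per center used here, and the SVD-QR selection could be embedded in the GA's fitness evaluation instead of being performed once beforehand; the sketch fixes both simplifications purely for brevity.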
