A Modified General Regression Neural Network (MGRNN) with new, efficient training algorithms as a robust 'black box'-tool for data analysis

A Modified General Regression Neural Network (MGRNN) is presented as an easy-to-use 'black box' tool: the user feeds in the available data and obtains a reasonable regression surface. The MGRNN is based on the General Regression Neural Network by D. Specht [Specht, D. (1991). A General Regression Neural Network. IEEE Transactions on Neural Networks, 2(6), 568-576]; the network's architecture and weights are therefore determined directly by the training data. Only the kernel width of each training sample is trained, by two supervised training algorithms. These fast and reliable algorithms require four user-definable parameters but are robust to changes in those parameters. The network's generalization ability was tested on several benchmarks: the intertwined spirals, the Mackey-Glass time series, and PROBEN1. The MGRNN provides two additional features: (1) it is trainable with arbitrary data as long as a suitable metric exists; in particular, it is unnecessary to force the data into vectors of equal length; (2) it can compute the gradient of the regression surface as long as the gradient of the metric is defined. The MGRNN thus avoids several practical problems common to feed-forward networks.
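
For reference, the regression surface the GRNN family computes is a kernel-weighted average of the training targets (the Nadaraya-Watson estimator with Gaussian Parzen kernels); the MGRNN's distinguishing trait is that every training sample carries its own trained kernel width. The sketch below is a minimal illustration only: it assumes a Euclidean metric, the function names (grnn_predict, grnn_gradient) and the constant demo widths are invented for exposition, and the paper's actual training algorithms for the per-sample widths are not reproduced here.

```python
import numpy as np

def grnn_predict(x, X_train, y_train, sigma):
    """GRNN / Nadaraya-Watson estimate at a query point x.

    X_train: (n, d) training inputs; y_train: (n,) targets;
    sigma: (n,) per-sample kernel widths (learned in the paper,
    simply given here).
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)  # squared Euclidean distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian kernel weights
    return np.sum(w * y_train) / np.sum(w)   # weighted average of targets

def grnn_gradient(x, X_train, y_train, sigma):
    """Analytic gradient of the regression surface w.r.t. x,
    available whenever the metric's gradient is defined."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    s = np.sum(w)
    y_hat = np.sum(w * y_train) / s
    # dw_i/dx = w_i * (x_i - x) / sigma_i^2, hence
    # grad = sum_i dw_i/dx * (y_i - y_hat) / sum_i w_i
    dw = (w / sigma ** 2)[:, None] * (X_train - x)
    return dw.T @ (y_train - y_hat) / s

# Tiny demo with constant (untrained) kernel widths:
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1]
sigma = np.full(50, 0.3)
print(grnn_predict(np.zeros(2), X, y, sigma))
print(grnn_gradient(np.zeros(2), X, y, sigma))
```

Because the estimate is a ratio of smooth kernel sums, both the prediction and its gradient come from the same stored training samples; this is why the architecture and weights need no training once the data are given.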

[1] Lang, K., et al. (1988). Learning to tell two spirals apart.

[2] Specht, D. F., et al. (1993). The general regression neural network - rediscovered. Neural Networks.

[3] Watson, G. S. (1964). Smooth regression analysis.

[4] Specht, D. F., et al. (1994). Experience with adaptive probabilistic neural networks and adaptive general regression neural networks. Proceedings of the 1994 IEEE International Conference on Neural Networks (ICNN'94).

[5] Press, W. H., et al. (1998). Numerical Recipes: The Art of Scientific Computing (2nd ed.).

[6] Prechelt, L. (1994). PROBEN1 - A set of benchmarks and benchmarking rules for neural network training algorithms.

[7] Parzen, E. (1962). On estimation of a probability density function and mode.

[8] Werbos, P. (1974). Beyond regression: New tools for prediction and analysis in the behavioral sciences.

[9] Salin, E. D., et al. (1995). Improved calibration for inductively coupled plasma-atomic emission spectrometry using generalized regression neural networks.

[10] Specht, D. F., et al. (1991). Generalization accuracy of probabilistic neural networks compared with backpropagation networks. IJCNN-91-Seattle International Joint Conference on Neural Networks.

[11] Hinton, G. E., et al. (1986). Learning representations by back-propagating errors. Nature.

[12] Masters, T. (1995). Advanced Algorithms for Neural Networks: A C++ Sourcebook.

[13] Hartmann, U., et al. (1992). Mapping neural network derived from the Parzen window estimator. Neural Networks.

[14] Cacoullos, T. (1966). Estimation of a multivariate density.

[15] Hansen, J. V., et al. (1996). Learning experiments with genetic optimization of a generalized regression neural network. Decision Support Systems.

[16] Glass, L., et al. (1977). Oscillation and chaos in physiological control systems. Science.

[17] Nadaraya, E. (1964). On estimating regression.

[18] Specht, D. F. (1991). A general regression neural network. IEEE Transactions on Neural Networks.