Radial basis function neural networks for nonlinear Fisher discrimination and Neyman-Pearson classification

We propose a novel technique for the design of radial basis function (RBF) neural networks (NNs). To select the various RBF parameters, the class membership information of the training samples is used to produce new cluster classes. This lets us emphasize classification performance on data from selected classes rather than the best overall classification rate, so that performance can be controlled as desired and Neyman-Pearson classification can be approximated. We also show that, with properly chosen desired output neuron levels, the hidden-to-output layer of the RBF network performs Fisher discriminant analysis, so the full system performs a nonlinear Fisher analysis. Results on an agricultural product inspection problem and on synthetic data confirm the effectiveness of these methods.
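The sketch below is a minimal illustration of the general idea, not the authors' exact algorithm: RBF centers are chosen by clustering each class separately (so class membership shapes the clusters), the hidden-to-output weights are fit by least squares to chosen target levels (with suitable targets this linear stage acts as a Fisher-style discriminant on the hidden-layer features), and the output threshold is swept to trade detection rate against false-alarm rate, approximating a Neyman-Pearson operating point. Names such as fit_rbf and n_centers_per_class are illustrative assumptions.

    # Minimal sketch under the assumptions stated above; not the paper's exact method.
    import numpy as np
    from scipy.cluster.vq import kmeans2
    from scipy.spatial.distance import cdist

    def fit_rbf(X, y, n_centers_per_class=5, sigma=1.0, targets=(-1.0, +1.0)):
        """Two-class RBF classifier with class-wise clustering of the centers."""
        centers = []
        for c in np.unique(y):
            # Cluster each class separately so every center belongs to one class.
            c_centers, _ = kmeans2(X[y == c], n_centers_per_class, minit='++', seed=0)
            centers.append(c_centers)
        centers = np.vstack(centers)

        # Gaussian hidden-layer activations, plus a bias column.
        H = np.exp(-cdist(X, centers, 'sqeuclidean') / (2.0 * sigma ** 2))
        H = np.hstack([H, np.ones((len(X), 1))])

        # Least-squares fit of hidden-to-output weights to the desired target levels.
        t = np.where(y == np.unique(y)[0], targets[0], targets[1])
        w, *_ = np.linalg.lstsq(H, t, rcond=None)
        return centers, w

    def decision_scores(X, centers, w, sigma=1.0):
        """Output-layer scores; thresholding them gives the class decision."""
        H = np.exp(-cdist(X, centers, 'sqeuclidean') / (2.0 * sigma ** 2))
        H = np.hstack([H, np.ones((len(X), 1))])
        return H @ w

Sweeping the decision threshold on these scores over a validation set traces out an ROC curve; choosing the threshold that meets a specified false-alarm rate gives a Neyman-Pearson style operating point.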
