Probabilistic neural network architecture for high-speed classification of remotely sensed imagery

Abstract In this article we discuss a neural network architecture, the probabilistic neural network (PNN), that, to the best of our knowledge, has not previously been applied to remotely sensed data. The PNN is a supervised nonparametric classification algorithm, in contrast to the parametric Gaussian maximum likelihood classifier (GMLC). The PNN works by fitting a Gaussian kernel to each training point; the width of the Gaussian is controlled by a tuning parameter called the window width. If very small widths are used, the method is equivalent to the nearest neighbor method, while for large windows the PNN behaves like the GMLC. The basic implementation of the PNN requires no training time at all. In this respect it is far better than the commonly used backpropagation neural network (BPNN), which can be shown to require O(N^6) time for training, where N is the dimensionality of the input vector. In addition, the PNN can be implemented in a feed-forward mode in hardware. The disadvantage of the PNN is that it requires all the training data to be stored; some solutions to this problem are discussed in the article. Finally, we discuss the accuracy of the PNN with respect to the GMLC and the BPNN. The PNN is shown to be better than the GMLC but not as good as the BPNN with regard to classification accuracy.
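The classification rule the abstract describes can be sketched compactly: center a Gaussian kernel of a chosen window width on every training point, sum the kernel responses per class, and assign the class with the largest summed response. The following is a minimal illustrative sketch, not the paper's implementation; the function and parameter names (`pnn_classify`, `sigma`) are assumptions.

```python
# Minimal sketch of a PNN (Parzen-window) classifier as described in the
# abstract. Illustrative only; names here are not from the paper.
import numpy as np

def pnn_classify(X_train, y_train, x, sigma=1.0):
    """Classify point x via class-conditional Parzen density estimates.

    A Gaussian kernel of width `sigma` (the "window width") is centered
    on every training point; the class with the largest average kernel
    response at x wins. There is no training phase: all training data
    are stored and used directly, as the abstract notes.
    """
    scores = {}
    for c in np.unique(y_train):
        pts = X_train[y_train == c]
        sq_dists = np.sum((pts - x) ** 2, axis=1)
        # Average kernel response = class-conditional density estimate
        scores[c] = np.mean(np.exp(-sq_dists / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)
```

As the abstract observes, a very small `sigma` makes the nearest training point dominate the sum (nearest-neighbor behavior), while a large `sigma` smooths each class toward a single broad Gaussian, approaching GMLC-like behavior.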