A k-nearest neighbor artificial neural network classifier

The authors propose an artificial neural network architecture that implements the k-nearest neighbor (k-NN) classifier. The architecture employs a k-maximum network, which has advantages over 'winner-take-all' networks and other techniques for selecting the maximum input: it requires fewer interconnections, and it selects exactly k maximum inputs provided that its (k-1)th and kth maximum inputs are distinct. The classification performance of the proposed classifier is identical to that of the traditional k-NN classifier, but the parallelism of the network greatly reduces the computational cost of traditional k-NN classification. Unlike multilayer perceptrons, which rely on slowly converging back-propagation algorithms, the k-NN artificial neural network classifier requires no training algorithm beyond the initial setting of the weights.
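As a point of reference for the decision rule the network computes, the following sketch emulates the abstract's pipeline sequentially: a matching score is computed against every stored prototype (the network does this in parallel), the k best matches are selected (the role of the k-maximum subnetwork), and the class is decided by majority vote. This is an illustrative NumPy implementation of the standard k-NN rule, not the authors' network; the function and variable names are our own.

```python
import numpy as np

def knn_classify(x, prototypes, labels, k=3):
    """Classify x by majority vote among its k nearest prototypes."""
    # Negative squared Euclidean distance: larger score = better match,
    # so selecting the k maximum scores picks the k nearest prototypes.
    scores = -np.sum((prototypes - x) ** 2, axis=1)
    # k-maximum selection; np.argpartition is a sequential stand-in
    # for the paper's parallel k-maximum subnetwork.
    top_k = np.argpartition(scores, -k)[-k:]
    # Majority vote among the class labels of the k selected prototypes.
    votes = np.bincount(labels[top_k])
    return int(np.argmax(votes))

# Toy example: two well-separated classes in the plane.
prototypes = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                       [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
labels = np.array([0, 0, 0, 1, 1, 1])
print(knn_classify(np.array([0.95, 1.05]), prototypes, labels, k=3))  # -> 1
```

Note that the k-maximum selection is well defined here only when the kth and (k+1)th scores differ, which corresponds to the distinctness condition the abstract places on the network's (k-1)th and kth maximum inputs.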
