Using K-Winner Machines for domain analysis

The K-Winner Machine (KWM) model combines unsupervised and supervised training paradigms and builds a family of nested classifiers that differ in their expected generalization performance. A KWM allows members of the classifier family to reject a test pattern, and predicting the rejection rate is crucial to the method's overall effectiveness. The analytical properties of the KWM first drive a theoretical analysis of rejection performance. The paper then shows that the KWM classification process can also be used profitably for domain inspection. Novel theorems connect the outputs of KWMs directly to the class-separating boundaries in the data space. Empirical evidence supports the intuitive result that smaller confidence values characterize boundary regions.
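To make the decision rule behind the nested family concrete, here is a minimal sketch of a K-winner classification step. It assumes the prototypes have already been placed by an unsupervised vector quantizer and labeled from the training data; the function name `kwm_classify` and its signature are illustrative, not the authors' reference implementation.

```python
import numpy as np

def kwm_classify(x, prototypes, labels, k):
    """Classify x with the k-th member of the nested KWM family.

    prototypes: (n, d) array of codevectors from unsupervised training
    labels:     (n,) array of class labels attached to the prototypes
    k:          number of winning prototypes that must agree

    Returns the shared label of the k nearest prototypes, or None
    (rejection) when the winners disagree.
    """
    dists = np.linalg.norm(prototypes - x, axis=1)   # distance to every prototype
    winners = labels[np.argsort(dists)[:k]]          # labels of the k nearest
    if np.all(winners == winners[0]):
        return winners[0]   # unanimous winners -> accept and classify
    return None             # disagreement among winners -> reject the pattern
```

Scanning k upward yields the nested family: larger k gives a more conservative classifier with a higher rejection rate, and the largest k at which the winners still agree can serve as a confidence score. Patterns rejected already at small k tend to lie where prototypes of different classes mix, which is consistent with the paper's result that low confidence values characterize class-boundary regions.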
