Classifiers in almost empty spaces

Recent developments in the design and training of statistical classifiers make it possible to build reliable classifiers for very small sample size problems. With these techniques, advanced problems can be tackled, such as pixel-based image recognition and dissimilarity-based object classification. We explain and illustrate how recognition systems based on support vector machines and subspace classifiers circumvent the curse of dimensionality, and may even find nonlinear decision boundaries for small training sets represented in Hilbert space.
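As a minimal sketch (not taken from the paper), the following Python fragment, assuming scikit-learn and a hypothetical synthetic dataset, illustrates the small sample size setting: far fewer training objects than feature dimensions, with an RBF-kernel support vector machine whose nonlinear decision boundary is built on at most as many support objects as there are training samples.

```python
# Minimal sketch, assuming scikit-learn; the data and parameters are
# illustrative, not from the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical toy data: 40 objects in a 500-dimensional feature space,
# mimicking e.g. a pixel-based image representation.
X, y = make_classification(n_samples=40, n_features=500, n_informative=10,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=20, random_state=0)

# The kernel SVM expresses its decision boundary in terms of support
# objects, so its effective complexity is bounded by the 20 training
# samples rather than by the 500 feature dimensions.
clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X_train, y_train)
print("number of support objects:", clf.n_support_.sum())
print("test accuracy:", clf.score(X_test, y_test))
```

The point of the sketch is the ratio of dimensions to samples (500 vs. 20): a classifier parameterized in the feature space would be hopelessly underdetermined here, while the kernel formulation works entirely in the span of the training objects.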
