Relational discriminant analysis and its large sample size problem

Relational discriminant analysis is based on a similarity matrix of the training set. It can construct reliable nonlinear discriminants in infinite-dimensional feature spaces from small training sets. The technique suffers from a large sample size problem, however, as the similarity matrix grows with the square of the number of training objects. We discuss and give an initial evaluation of a solution that drastically reduces training time and memory demands.
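
To make the setting concrete, below is a minimal sketch of a similarity-based discriminant of this flavour: each object is represented by its similarities to a representation set, and a pseudo-inverse linear classifier is trained on those similarity rows. The Gaussian (RBF) similarity, the random choice of prototypes, and the names rbf_similarity and fit_relational_fisher are illustrative assumptions, not the paper's exact procedure. Shrinking the representation set from all n training objects to m prototypes is one way to reduce the n x n similarity matrix to n x m.

```python
import numpy as np

def rbf_similarity(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) similarity between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_relational_fisher(X, y, R=None):
    """Train a pseudo-inverse linear discriminant in the similarity space.

    X : (n, d) training objects, y : (n,) labels in {0, 1}.
    R : optional (m, d) representation set; if None, the full training
        set is used and the similarity matrix is n x n.
    """
    if R is None:
        R = X                      # full n x n similarity matrix
    S = rbf_similarity(X, R)       # n x m similarity representation
    S1 = np.hstack([S, np.ones((S.shape[0], 1))])   # add bias term
    t = np.where(y == 1, 1.0, -1.0)
    w = np.linalg.pinv(S1) @ t     # least-squares weights via pseudo-inverse
    return R, w

def predict(model, X_new):
    R, w = model
    S = rbf_similarity(X_new, R)
    S1 = np.hstack([S, np.ones((S.shape[0], 1))])
    return (S1 @ w >= 0).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
    y = np.r_[np.zeros(100, int), np.ones(100, int)]
    # Reduced representation set: 20 randomly chosen prototypes instead of
    # all 200 objects, shrinking the similarity matrix from 200x200 to 200x20.
    R = X[rng.choice(len(X), 20, replace=False)]
    model = fit_relational_fisher(X, y, R)
    print("training accuracy:", (predict(model, X) == y).mean())
```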
