This paper describes a new classification technique named the Local Subspace Classifier (LSC). The algorithm is closely related to subspace classification methods; at the same time, it is an heir of prototype-based classification methods such as the k-NN rule. The LSC technique is therefore argued to fill the gap between the subspace and prototype principles of classification. From the domain of prototype-based classifiers, LSC inherits the benefits of the local nature of classification, while it simultaneously exploits the capability of subspace classifiers to generalize from the training sample. A further enhancement of the LSC principle, named the Convex Local Subspace Classifier (LSC+), is also presented. The good classification accuracy obtainable with the LSC and LSC+ classifiers is demonstrated by experiments, including classification of the publicly available data sets of the StatLog project.
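The combination of the two principles described above can be sketched as follows. This is a hypothetical minimal illustration, not the paper's reference implementation: it assumes that, for each class, LSC spans a local affine subspace through the k training prototypes nearest to the query point and assigns the class whose subspace yields the smallest residual distance. The function name `lsc_predict` and the parameter `k` are illustrative choices.

```python
import numpy as np

def lsc_predict(X_train, y_train, x, k=2):
    """Sketch of the Local Subspace Classifier (LSC) principle.

    For each class: pick the k prototypes nearest to x, span a local
    affine subspace through them, and measure the residual distance
    from x to that subspace. Predict the class with the smallest residual.
    """
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        P = X_train[y_train == c]
        # Local step: the k nearest prototypes of this class to x.
        idx = np.argsort(np.linalg.norm(P - x, axis=1))[:k]
        Pk = P[idx]
        # Affine subspace through the selected prototypes:
        # origin Pk[0], basis vectors Pk[i] - Pk[0].
        origin = Pk[0]
        B = (Pk[1:] - origin).T
        r = x - origin
        if B.size:
            # Subspace step: least-squares projection onto the local span.
            coef, *_ = np.linalg.lstsq(B, r, rcond=None)
            r = r - B @ coef
        d = np.linalg.norm(r)
        if d < best_dist:
            best_class, best_dist = c, d
    return best_class

# Two classes of prototypes lying on the lines y = 0 and y = 2.
X = np.array([[0, 0], [1, 0], [2, 0], [0, 2], [1, 2], [2, 2]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
pred = lsc_predict(X, y, np.array([0.5, 0.4]))  # nearer to the y = 0 subspace
```

LSC+ would additionally constrain the combination coefficients `coef` to form a convex combination of the prototypes; that constrained variant is not sketched here.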