In this paper we present a linear pattern classification algorithm, Principal Component Null Space Analysis (PCNSA), which uses only the first- and second-order statistics of the data for classification, and we compare its performance with existing linear algorithms. PCNSA first projects the data into the PCA space in order to maximize between-class variance. It then finds, for each class, separate directions in the PCA space along which that class has the least variance (in the ideal case, the null space of the within-class covariance matrix); we define these directions as the "approximate null space" (ANS) of the class. To obtain the ANS, we compute the covariance matrix of the class data in PCA space and take its eigenvectors with the smallest eigenvalues. The method rests on the assumption that an ANS of the within-class covariance matrix exists, which holds for many classification problems. A query is assigned to the class for which the distance of the query from the class mean, projected along the ANS of that class, is minimum. Results demonstrating PCNSA's superior performance over LDA and PCA are shown for object recognition.
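To make the two-stage procedure concrete, the following NumPy sketch implements the steps as summarized above: a PCA projection, per-class estimation of the ANS from the smallest eigenvectors of the class covariance, and nearest-mean classification along each class's ANS. The function names (pcnsa_train, pcnsa_classify) and the parameters n_pca and n_ans (the retained PCA and ANS dimensions) are hypothetical choices for illustration; this is a minimal sketch of the method as described, not the authors' reference implementation.

```python
import numpy as np

def pcnsa_train(X, y, n_pca, n_ans):
    """X: (n_samples, n_features) data matrix; y: integer class labels.
    n_pca, n_ans: retained PCA / ANS dimensions (illustrative parameters)."""
    # Stage 1: PCA projection onto the top n_pca directions of the total covariance.
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    W_pca = eigvecs[:, np.argsort(eigvals)[::-1][:n_pca]]
    Z = (X - mean) @ W_pca

    model = {"mean": mean, "W_pca": W_pca, "classes": np.unique(y),
             "mu": {}, "ans": {}}
    for c in model["classes"]:
        Zc = Z[y == c]
        mu_c = Zc.mean(axis=0)
        # Stage 2: approximate null space (ANS) of the class covariance,
        # i.e. the eigenvectors with the smallest eigenvalues.
        Sc = np.cov(Zc, rowvar=False)
        vals, vecs = np.linalg.eigh(Sc)
        model["ans"][c] = vecs[:, np.argsort(vals)[:n_ans]]
        model["mu"][c] = mu_c
    return model

def pcnsa_classify(model, x):
    """Assign x to the class minimizing its distance from the class mean,
    measured along that class's ANS directions."""
    z = (x - model["mean"]) @ model["W_pca"]
    dists = [np.linalg.norm(model["ans"][c].T @ (z - model["mu"][c]))
             for c in model["classes"]]
    return model["classes"][int(np.argmin(dists))]
```

Because each class keeps its own ANS, the classifier exploits directions in which that particular class is nearly degenerate, which is what distinguishes the approach from a single shared discriminant projection as in LDA.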