In this paper we demonstrate the subspace generalization power of the kernel correlation feature analysis (KCFA) method for extracting a low-dimensional subspace that can represent new, unseen datasets. Examining the portability of an algorithm across different datasets is an important practical aspect of real-world face recognition, where the technology cannot be dataset-dependent. In most of the face recognition literature, algorithms are demonstrated by training on one portion of a dataset and testing on the remainder; the test subjects typically overlap the training subjects partially or entirely, with disjoint images captured in different sessions, so both the expected facial variations and the people's faces themselves are modeled in the training set. In this paper we describe how to efficiently build a compact feature subspace using kernel correlation filter analysis on the generic training set of the FRGC dataset and then use that basis for recognition on a different dataset. The KCFA feature subspace has a total dimension equal to the number of training subjects; we vary this number up to all 222 subjects available in the FRGC generic training set. We test the subspace built by KCFA by projecting other well-known face datasets onto it, and show that it has good representation and discrimination power on unseen datasets, producing good verification and identification rates compared to other subspace and dimensionality-reduction methods such as PCA (trained on the same FRGC generic set). Its efficiency, lower dimensionality, and discriminative power make KCFA more practical and powerful than PCA as a robust dimensionality-reduction method for modeling faces and facial variations.
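The evaluation protocol above can be illustrated with a minimal sketch. This is not the authors' exact KCFA formulation; it is a simplified kernel class-dependence analysis in which one kernel filter is learned per training subject (so the feature dimension equals the number of training subjects, as in the paper), using a ridge-regularized least-squares solve against one-hot correlation targets. The RBF kernel, `gamma`, and `reg` values are illustrative assumptions; unseen datasets are then "projected" onto the learned basis through kernel evaluations against the generic training set.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.05):
    # Pairwise RBF kernel between rows of X and rows of Y
    # (gamma is an assumed, illustrative bandwidth).
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def train_kcfa(X_train, labels, gamma=0.05, reg=1e-6):
    """Learn one kernel filter per training subject.

    The feature dimension equals the number of distinct subjects,
    mirroring the paper's subspace size (up to 222 for FRGC).
    """
    K = rbf_kernel(X_train, X_train, gamma)
    subjects = np.unique(labels)
    # Desired correlation outputs: 1 for the filter's own subject, 0 otherwise.
    U = (labels[:, None] == subjects[None, :]).astype(float)
    # Ridge-regularized solve for the filter coefficients in kernel space.
    A = np.linalg.solve(K + reg * np.eye(len(labels)), U)
    return A

def project(X_new, X_train, A, gamma=0.05):
    # Project any dataset (possibly unseen subjects) onto the KCFA-style
    # subspace via kernel evaluations against the generic training set.
    return rbf_kernel(X_new, X_train, gamma) @ A
```

In this sketch, verification on an unseen dataset would compare the projected feature vectors (e.g., by cosine similarity) rather than the raw images, which is what makes the low-dimensional basis portable across datasets.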