Subspace Ensembles for Face Recognition

In many cases, the statistical learning methods used to solve the face recognition problem tend to overfit the training data. This is because of the small number of individuals and the large number of features in the face space. When restricted to a single subspace, these methods can also suffer from intra-personal variations. One way to handle these problems is to construct multiple subspaces from the original face space. In this paper, a novel ensemble framework based on the random subspace method is presented. We focus on the Fisherface-based face recognition approach and propose using a classifier selection methodology to improve the random subspace LDA technique. Our method selects a small subset of complementary classifiers from a pool of randomly generated classifiers and, by combining their decisions via a trainable combiner (Mixture of Experts), aims to achieve high classification accuracy at lower operating complexity. Experiments on the Cohn-Kanade Expression Database demonstrate the satisfactory performance of the proposed method compared to the Eigenface, Fisherface, and random subspace LDA methods.
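The core idea of a random subspace ensemble can be sketched as follows: each member classifier is trained on a random subset of the feature dimensions, and the members' decisions are then combined. This is a minimal illustration only; it uses plain LDA classifiers and a simple majority vote in place of the paper's Fisherface projections and trainable Mixture-of-Experts combiner, and all function names, dimensions, and parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def train_random_subspace_lda(X, y, n_classifiers=10, subspace_dim=20):
    """Train one LDA classifier per randomly sampled feature subset.

    Returns a list of (feature_indices, fitted_classifier) pairs.
    """
    n_features = X.shape[1]
    ensemble = []
    for _ in range(n_classifiers):
        # sample a random subspace of the original feature space
        idx = rng.choice(n_features, size=subspace_dim, replace=False)
        clf = LinearDiscriminantAnalysis().fit(X[:, idx], y)
        ensemble.append((idx, clf))
    return ensemble

def predict_majority(ensemble, X):
    """Combine member decisions by majority vote (a stand-in for the
    trainable Mixture-of-Experts combiner described in the paper)."""
    votes = np.stack([clf.predict(X[:, idx]) for idx, clf in ensemble])
    # most frequent predicted label per sample
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

A classifier-selection step, as proposed in the paper, would sit between training and combination: members would be scored on a validation set and only a small complementary subset passed to the combiner.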
