A novel fusion-based method for expression-invariant gender classification

In this paper, we propose a novel fusion-based gender classification method that compensates for facial expression even when the training samples contain only neutral expressions. We conduct an experimental investigation to evaluate the significance of different facial regions for the task of gender classification, and the three most significant regions are used in our fusion-based method. Classification is performed with support vector machines on features extracted using two-dimensional principal component analysis (2DPCA). Experiments show that our fusion-based method compensates for facial expressions and achieves the highest correct classification rate of 95.33%.
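The pipeline described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the region crops, 2DPCA dimension, SVM parameters, and the majority-vote fusion rule are all assumptions for demonstration (scikit-learn's `SVC`, which wraps LIBSVM, stands in for the SVM classifier).

```python
import numpy as np
from sklearn.svm import SVC

def twod_pca(images, d):
    """2DPCA: project each h x w image onto the top-d eigenvectors
    of the image covariance matrix G (a w x w matrix)."""
    centered = images - images.mean(axis=0)
    # G = average over samples of A^T A for each centered image A
    G = np.einsum('nha,nhb->ab', centered, centered) / len(images)
    _, eigvecs = np.linalg.eigh(G)        # eigenvalues in ascending order
    X = eigvecs[:, ::-1][:, :d]           # top-d principal axes
    return images @ X                      # (n, h, d) projected features

# Hypothetical demo: three synthetic "facial regions" per sample,
# one SVM per region, fused by majority vote over the regions.
rng = np.random.default_rng(0)
n, h, w, d = 60, 16, 16, 3
labels = rng.integers(0, 2, n)            # synthetic gender labels
regions = []
for _ in range(3):
    base = rng.normal(size=(2, h, w))     # one class template per label
    regions.append(base[labels] + 0.3 * rng.normal(size=(n, h, w)))

votes = np.zeros((n, 2))
for imgs in regions:
    feats = twod_pca(imgs, d).reshape(n, -1)
    pred = SVC(kernel='linear').fit(feats, labels).predict(feats)
    votes[np.arange(n), pred] += 1        # each region casts one vote
fused = votes.argmax(axis=1)              # majority vote = fused decision
accuracy = (fused == labels).mean()
print(f"fused training accuracy: {accuracy:.2f}")
```

In practice the per-region SVMs would be trained on neutral-expression images and evaluated on expressive ones; the fusion step is what lets an expression-distorted region be outvoted by less affected regions.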
