Orthogonal canonical correlation analysis and its application in feature fusion

Canonical correlation analysis (CCA) is an important method for multiple feature extraction and fusion. The canonical projective vectors in the classical CCA method satisfy conjugate orthogonality constraints. However, the conjugate orthogonality property is badly affected by the small sample size (SSS) problem, so the projections obtained by classical CCA are usually suboptimal for recognition in such cases. Orthogonality, as a common criterion, is widely used in feature extraction, and it is less affected by poor estimation of the covariance matrices. In this paper, we propose a novel canonical correlation analysis method with orthogonality constraints, called Orthogonal CCA (OCCA). We replace the conjugate orthogonality constraints of classical CCA with plain orthogonality constraints, and propose an algorithm based on twin eigendecomposition that obtains the orthogonal canonical projective vectors one at a time. Experimental results on the UCI multiple-feature database and the ORL face database show that OCCA achieves better recognition rates and robustness than previous feature fusion methods.
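The idea can be illustrated with a minimal sketch. The code below is not the paper's twin-eigendecomposition algorithm; it is an assumed greedy stand-in: solve a regularized CCA for one leading pair at a time, restricting each new x-side direction to the Euclidean orthogonal complement of those already found, so the resulting projection matrix satisfies W^T W = I rather than the conjugate constraint W^T S_xx W = I. All function names are illustrative.

```python
import numpy as np

def _inv_sqrt(S):
    # Symmetric inverse square root via eigendecomposition
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T

def cca_pair(X, Y, reg=1e-6):
    """Leading canonical pair of centered X (n,p) and Y (n,q),
    with ridge regularization to cope with small sample sizes."""
    n = len(X)
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    Rx, Ry = _inv_sqrt(Sxx), _inv_sqrt(Syy)
    # SVD of the whitened cross-covariance gives the canonical directions
    U, s, Vt = np.linalg.svd(Rx @ Sxy @ Ry)
    return Rx @ U[:, 0], Ry @ Vt[0], s[0]

def orthogonal_cca(X, Y, k):
    """Greedy sketch of orthogonally constrained CCA: each new
    x-direction is sought in the orthogonal complement of the
    directions found so far, so the returned W satisfies W.T @ W = I."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    p = X.shape[1]
    W = np.zeros((p, 0))
    for _ in range(k):
        P = np.eye(p) - W @ W.T          # projector onto complement of span(W)
        wx, _, _ = cca_pair(X @ P, Y)    # solve CCA in the restricted subspace
        wx = P @ wx                      # enforce exact orthogonality to W
        wx /= np.linalg.norm(wx)
        W = np.column_stack([W, wx])
    return W
```

Classical CCA would instead make the columns of W conjugate-orthogonal with respect to S_xx, a property that degrades when S_xx is poorly estimated from few samples; the Euclidean constraint above does not depend on that estimate.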
