Correlation Embedding Analysis

To design geometrically motivated approaches for classifying high-dimensional data, we propose to learn a discriminant subspace using Correlation Embedding Analysis (CEA). This algorithm enhances its discriminant power by combining correlational graph embedding with the Fisher criterion. Geometrically, it projects high-dimensional data onto a hypersphere and preserves intrinsic neighbor relations under the Pearson correlation metric. After the embedding, data pairs from the same class are driven to increase their correlation affinity, while neighboring points from different classes are simultaneously driven to reduce theirs. The features learned by CEA are tolerant to scaling and outliers. Experiments on face recognition demonstrate the effectiveness and advantages of CEA.
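The two geometric ingredients described above can be illustrated with a short sketch: centering and L2-normalizing each sample places it on a unit hypersphere where the dot product of two samples equals their Pearson correlation, and same-class versus different-class neighbor graphs can then be built from that correlation matrix. This is a minimal NumPy sketch of those two steps only, not the authors' full CEA algorithm; the function names (`correlation_normalize`, `affinity_graphs`) and the binary graph weights are illustrative assumptions.

```python
import numpy as np

def correlation_normalize(X):
    """Center each row by its own mean and scale it to unit norm.

    After this transform every sample lies on the unit hypersphere,
    and the dot product of two rows equals their Pearson correlation.
    """
    Xc = X - X.mean(axis=1, keepdims=True)
    return Xc / np.linalg.norm(Xc, axis=1, keepdims=True)

def affinity_graphs(X, labels, k=5):
    """Build same-class and different-class k-NN graphs under the
    correlation metric (binary weights; a simplifying assumption)."""
    Z = correlation_normalize(X)
    C = Z @ Z.T                      # pairwise Pearson correlations
    n = len(labels)
    W_intra = np.zeros((n, n))       # same-class neighbors to pull together
    W_inter = np.zeros((n, n))       # different-class neighbors to push apart
    for i in range(n):
        order = np.argsort(-C[i])    # most correlated samples first
        same = [j for j in order if j != i and labels[j] == labels[i]][:k]
        diff = [j for j in order if labels[j] != labels[i]][:k]
        W_intra[i, same] = 1.0
        W_inter[i, diff] = 1.0
    # Symmetrize so each graph defines an undirected neighborhood relation.
    return np.maximum(W_intra, W_intra.T), np.maximum(W_inter, W_inter.T)
```

A discriminant projection would then be sought that increases correlation affinity over the intra-class graph while decreasing it over the inter-class graph, in the spirit of the Fisher criterion.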
