Robust Locality Preserving Projections With Cosine-Based Dissimilarity for Linear Dimensionality Reduction

Locality preserving projection (LPP) is a classical tool for dimensionality reduction. However, it is sensitive to outliers because it relies on the $\ell_2$-norm-based distance criterion. In this paper, we propose a new approach, termed Euler-LPP, which preserves the local structure of the data under a cosine-based dissimilarity criterion. Euler-LPP is robust to outliers in that the cosine-based dissimilarity suppresses the influence of outliers more effectively than the $\ell_2$-norm. An explicit mapping, defined by a complex kernel (the Euler kernel), is adopted to map the data from the input space into a complex reproducing kernel Hilbert space (CRKHS), in which the $\ell_2$-norm distance between data pairs equals their cosine-based dissimilarity in the input space. Thus, the robust dimensionality reduction problem can be solved directly in the CRKHS, where the solution is guaranteed to converge to a global minimum. In addition, Euler-LPP is easy to implement and does not significantly increase computational complexity. Experimental results on several benchmark databases confirm the effectiveness of the proposed method.
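As an illustration of the pipeline the abstract describes, the sketch below runs a standard LPP in a complex feature space obtained through an explicit Euler-type mapping $z(x) = e^{i\alpha\pi x}/\sqrt{2}$ (the form used in Euler PCA). The mapping, the choice $\alpha = 1.9$, the k-nearest-neighbor heat-kernel affinity, and the function names (`euler_map`, `euler_lpp`) are assumptions made for illustration, not the paper's exact algorithm.

```python
import numpy as np
from scipy.linalg import eigh

def euler_map(X, alpha=1.9):
    """Explicit Euler-type mapping (assumed form): each feature x in [0, 1]
    becomes exp(i*alpha*pi*x)/sqrt(2), so the squared l2 distance between
    mapped samples equals the cosine-based dissimilarity in input space."""
    return np.exp(1j * alpha * np.pi * X) / np.sqrt(2.0)

def euler_lpp(X, n_components=10, n_neighbors=5, t=1.0, alpha=1.9):
    """Sketch of robust LPP solved on the complex-mapped data.

    X : (n_samples, n_features) array with entries scaled to [0, 1].
    Returns a complex projection matrix U of shape (n_features, n_components).
    """
    Z = euler_map(X, alpha)                      # n x d complex matrix
    n = Z.shape[0]

    # Pairwise squared l2 distances in the mapped space
    # (= cosine-based dissimilarities in the input space).
    D2 = np.square(np.abs(Z[:, None, :] - Z[None, :, :])).sum(axis=2)

    # k-nearest-neighbor adjacency with heat-kernel weights (standard LPP choice).
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:n_neighbors + 1]   # skip the point itself
        W[i, idx] = np.exp(-D2[i, idx] / t)
    W = np.maximum(W, W.T)                       # symmetrize

    Dg = np.diag(W.sum(axis=1))                  # degree matrix
    L = Dg - W                                   # graph Laplacian

    # Generalized Hermitian eigenproblem  Z^H L Z u = lambda Z^H D Z u;
    # the eigenvectors of the smallest eigenvalues give the projections.
    A = Z.conj().T @ L @ Z
    B = Z.conj().T @ Dg @ Z + 1e-8 * np.eye(Z.shape[1])  # small regularizer
    _, evecs = eigh(A, B)
    return evecs[:, :n_components]

# Usage: project the mapped data onto the learned complex subspace.
# X = np.random.rand(100, 64)          # rows scaled to [0, 1]
# U = euler_lpp(X, n_components=5)
# Y = euler_map(X) @ U                 # low-dimensional complex embedding
```

The key identity behind this sketch is that $\|z(x) - z(y)\|^2 = \sum_j \bigl(1 - \cos(\alpha\pi(x_j - y_j))\bigr)$, i.e. exactly the cosine-based dissimilarity, so solving the ordinary LPP eigenproblem on the mapped data optimizes the robust criterion directly.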
