Orthogonal margin discriminant projection for dimensionality reduction

Dimensionality reduction aims to represent high-dimensional data with a much smaller number of features. As a preprocessing step in many machine learning applications, it removes insignificant and irrelevant features, lowering computational cost and improving classifier performance. In many cases, data points can be well classified using margin samples, defined as the furthest intra-class samples and the nearest inter-class samples. Motivated by this observation, this paper proposes a linear supervised dimensionality reduction method called orthogonal margin discriminant projection (OMDP). After OMDP projection, intra-class data points become more compact and inter-class data points become more separated. Extensive experiments on several benchmark face data sets confirm the effectiveness of the proposed method.
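The margin-sample idea described above can be sketched in code. The following is a minimal illustration, not the paper's exact OMDP formulation: for each sample it takes the furthest same-class sample and the nearest different-class sample as margin samples, builds within-margin and between-margin scatter matrices, and maximizes a difference criterion tr(Wᵀ(Sb − Sw)W). All function and variable names are hypothetical; Euclidean distance and the difference (rather than quotient) criterion are assumptions.

```python
import numpy as np

def margin_discriminant_projection(X, y, d):
    """Hedged sketch of a margin-based discriminant projection.

    X : (n, D) data matrix, y : (n,) integer labels, d : target dimension.
    Returns an orthogonal D x d projection matrix W. This illustrates the
    margin-sample idea only; it is not the paper's exact OMDP objective.
    """
    n, D = X.shape
    Sw = np.zeros((D, D))  # scatter toward furthest intra-class samples
    Sb = np.zeros((D, D))  # scatter toward nearest inter-class samples
    for i in range(n):
        same = (y == y[i])
        same[i] = False
        diff = (y != y[i])
        dists = np.linalg.norm(X - X[i], axis=1)
        if same.any():
            # furthest sample of the same class: pull it closer
            j = np.argmax(np.where(same, dists, -np.inf))
            dw = (X[i] - X[j])[:, None]
            Sw += dw @ dw.T
        if diff.any():
            # nearest sample of a different class: push it away
            k = np.argmin(np.where(diff, dists, np.inf))
            db = (X[i] - X[k])[:, None]
            Sb += db @ db.T
    # Eigenvectors of the symmetric matrix (Sb - Sw) are mutually
    # orthogonal, so the top-d eigenvectors give an orthogonal projection
    # maximizing tr(W^T (Sb - Sw) W) under W^T W = I.
    vals, vecs = np.linalg.eigh(Sb - Sw)
    W = vecs[:, np.argsort(vals)[::-1][:d]]
    return W
```

After projecting with `X @ W`, same-class margin pairs are drawn together and different-class margin pairs are pushed apart, which is the compactness/separation behavior the abstract describes.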
