Laplacian Eigenmaps modification using adaptive graph for pattern recognition

Laplacian Eigenmaps (LE) is a representative nonlinear graph-based (manifold) dimensionality reduction (DR) method that has been applied to many practical problems such as pattern recognition and spectral clustering. However, it is generally difficult to assign appropriate values to the neighborhood size and the heat kernel parameter used in LE graph construction. In this paper, we modify graph construction by learning a graph in the neighborhood of a pre-specified one. Specifically, the pre-specified graph is treated as a noisy observation of the ideal graph, and the squared Frobenius divergence between the two is used to measure their difference in the objective function. In this way, we obtain a framework for simultaneously learning the graph and optimizing the projection. The framework yields a principled edge-weight updating formula that naturally corresponds to the classical heat kernel weights. Experimental results on UCI datasets with different classifiers show the feasibility and effectiveness of the proposed method compared with conventional LE for classification.
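To make the baseline concrete, the following is a minimal sketch of conventional LE as described above: a k-nearest-neighbor graph with heat kernel edge weights (the two parameters, the neighborhood size `k` and the kernel width `t`, are exactly those the abstract identifies as hard to tune), followed by the generalized eigenproblem on the graph Laplacian. This is an illustration of classical LE only, not the paper's adaptive-graph method; the function name and defaults are our own.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=5, t=1.0):
    """Classical LE: kNN graph with heat kernel weights, then solve
    the generalized eigenproblem L y = lambda D y (Belkin & Niyogi)."""
    n = X.shape[0]
    sq_dists = cdist(X, X, 'sqeuclidean')

    # Build the kNN adjacency with heat kernel weights exp(-||xi-xj||^2 / t).
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq_dists[i])[1:k + 1]  # skip the point itself
        W[i, nbrs] = np.exp(-sq_dists[i, nbrs] / t)
    W = np.maximum(W, W.T)  # symmetrize the graph

    D = np.diag(W.sum(axis=1))  # degree matrix
    L = D - W                   # unnormalized graph Laplacian

    # Smallest generalized eigenvectors give the embedding; the first
    # (eigenvalue ~0, constant vector) is trivial and discarded.
    _, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]
```

The proposed method replaces the fixed weights `W` above with a graph learned jointly with the projection, penalizing deviation from this pre-specified graph via the squared Frobenius divergence.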
