Incremental Manifold Learning Algorithm Using PCA on Overlapping Local Neighborhoods for Dimensionality Reduction

A novel manifold learning algorithm called LPcaML is proposed in this paper. Based on the geometric intuition that a d-dimensional manifold locally lies on, or close to, a d-dimensional linear subspace, LPcaML first finds a ***-TSLN of the whole high-dimensional input data set and then obtains low-dimensional local coordinates for each neighborhood in the ***-TSLN using the classical PCA technique, preserving the local geometric and topological properties of each neighborhood. Finally, LPcaML transforms the local coordinates into a unified global low-dimensional representation by processing the neighborhoods in the order in which they appear in the ***-TSLN; the transformation function for each neighborhood is obtained by solving a least-squares problem over the examples it shares with previously processed neighborhoods. By using this divide-and-conquer strategy, LPcaML can learn from incremental data and discover the underlying manifold efficiently even when the data set is large. Experiments on both synthetic data sets and real face data sets demonstrate the effectiveness of LPcaML. Moreover, LPcaML can discover the manifold from sparsely sampled data sets where other manifold learning algorithms fail.
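The core mechanism described above (per-neighborhood PCA followed by a least-squares alignment through overlapping examples) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the neighborhood split, the choice of the first neighborhood as the global frame, and the affine form of the transformation are our assumptions.

```python
import numpy as np

def local_pca(X, d):
    # Center the neighborhood and project onto its top-d principal directions (classical PCA).
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T

def align_to_global(Y_global_overlap, Y_local, overlap_idx):
    # Fit an affine map W (scale/rotation plus offset) so that, on the shared
    # examples, the local coordinates match the already-fixed global ones:
    #   [Y_local[overlap], 1] @ W  ≈  Y_global_overlap   (least squares).
    Yo = np.hstack([Y_local[overlap_idx], np.ones((len(overlap_idx), 1))])
    W, *_ = np.linalg.lstsq(Yo, Y_global_overlap, rcond=None)
    # Apply the fitted map to the whole neighborhood.
    Ya = np.hstack([Y_local, np.ones((Y_local.shape[0], 1))])
    return Ya @ W

# Toy data: points on a 1-D curve in 3-D, split into two overlapping neighborhoods.
t = np.linspace(0.0, 1.0, 40)
X = np.stack([t, np.sin(t), np.cos(t)], axis=1)
N1, N2 = X[:25], X[15:]              # rows 15..24 of X lie in both neighborhoods
Y1 = local_pca(N1, d=1)              # global frame := first neighborhood's PCA coordinates
Y2 = local_pca(N2, d=1)              # second neighborhood's own local coordinates
overlap_in_N1 = np.arange(15, 25)
overlap_in_N2 = np.arange(0, 10)
Y2_global = align_to_global(Y1[overlap_in_N1], Y2, overlap_in_N2)
```

After alignment, `Y2_global` agrees with `Y1` on the overlap (up to the residual of the least-squares fit), so the two neighborhoods share one consistent global 1-D coordinate; processing neighborhoods in sequence this way is what allows incremental, large-scale operation.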
