Convergence and Rate of Convergence of a Manifold-Based Dimension Reduction Algorithm

We study the convergence and the rate of convergence of a local manifold learning algorithm, LTSA [13]. The main technical tool is a perturbation analysis of the linear invariant subspace that corresponds to the solution of LTSA. We derive a worst-case upper bound on the error of LTSA, which naturally leads to a convergence result. We then derive the rate of convergence of LTSA in a special case.
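To make the object of study concrete, the following is a minimal NumPy sketch of the LTSA procedure (local tangent coordinates via SVD, alignment matrix, bottom eigenvectors). The function name, the brute-force neighbor search, and the parameter choices are our own illustrative simplifications, not the authors' implementation or the version analyzed in the paper.

```python
import numpy as np

def ltsa(X, d, k):
    """Sketch of LTSA: embed (N, D) data X into d dimensions using k neighbors."""
    N = X.shape[0]
    # k nearest neighbors of each point (brute force; a point is its own nearest)
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(D2, axis=1)[:, :k]
    B = np.zeros((N, N))                        # alignment matrix
    for i in range(N):
        idx = nbrs[i]
        Xi = X[idx] - X[idx].mean(axis=0)       # center the neighborhood
        # local tangent coordinates: top-d left singular vectors of the
        # centered neighborhood
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        G = np.hstack([np.ones((k, 1)) / np.sqrt(k), U[:, :d]])
        # accumulate the local orthogonal-projection penalty I - G G^T
        B[np.ix_(idx, idx)] += np.eye(k) - G @ G.T
    # global embedding: eigenvectors of B for the 2nd..(d+1)th smallest
    # eigenvalues (the smallest corresponds to the constant vector)
    _, vecs = np.linalg.eigh(B)
    return vecs[:, 1:d + 1]
```

On a noiseless one-dimensional curve sampled densely enough, the recovered coordinate is, up to an affine map, close to the curve's parameterization, which is the kind of error the paper's perturbation analysis quantifies.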

[1] M. Brand et al. Charting a Manifold. NIPS, 2002.

[2] V. N. Bogaevski et al. Matrix Perturbation Theory. 1991.

[3] S. T. Roweis et al. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000.

[4] D. Donoho et al. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences, 2003.

[5] H. Zha et al. Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment. arXiv, 2002.

[6] H. Zha et al. Spectral analysis of alignment in manifold learning. IEEE ICASSP, 2005.

[7] J. Tenenbaum et al. A global geometric framework for nonlinear dimensionality reduction. Science, 2000.

[8] X. Huo et al. Performance Analysis of a Manifold Learning Algorithm in Dimension Reduction. 2006.

[9] M. Belkin et al. Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation, 2003.

[10] H. Zha et al. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Sci. Comput., 2004.

[11] A. K. Smith. New results in dimension reduction and model selection. 2008.

[12] H. Zha et al. Spectral Properties of the Alignment Matrices in Manifold Learning. SIAM Rev., 2009.

[13] D. Donoho et al. Hessian Eigenmaps: New locally linear embedding techniques for high-dimensional data. 2003.