Incremental manifold learning by spectral embedding methods

Recent years have witnessed the great success of manifold learning methods in understanding the structure of multidimensional patterns. However, most of these methods operate in batch mode and cannot be applied effectively when data are collected sequentially. In this paper, we propose a general incremental learning framework for the so-called spectral embedding methods, capable of handling one or more new samples at a time. In the proposed framework, the incremental dimensionality reduction problem reduces to an incremental matrix eigen-problem. Furthermore, using this framework as a tool, we present an incremental version of Hessian eigenmaps, the IHLLE method. Finally, we report experimental results on both synthetic and real-world datasets, demonstrating the efficiency and accuracy of the proposed algorithm.
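
To make the core idea concrete, below is a minimal sketch (in Python with NumPy) of the kind of incremental eigen-update such a framework relies on: when a new sample arrives, the matrix whose eigenvectors define the embedding grows by one row and column, and the previous eigenvectors are used to warm-start an orthogonal (subspace) iteration instead of re-solving the batch eigen-problem from scratch. The Gaussian affinity matrix, the function names (`update_embedding`, `orthogonal_iteration`), and the use of the dominant eigenspace are illustrative assumptions; LLE and Hessian eigenmaps actually embed with the smallest eigenvectors of a sparse alignment matrix, but the warm-started update idea is the same. This is not the paper's exact IHLLE algorithm.

```python
# Illustrative sketch only; not the paper's IHLLE algorithm.
import numpy as np

def orthogonal_iteration(M, Q0, n_iter=50):
    """Refine an orthonormal basis Q0 toward the dominant eigenspace of the
    symmetric matrix M (warm-started subspace iteration)."""
    Q = Q0
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(M @ Q)       # multiply, then re-orthonormalize
    eigvals = np.diag(Q.T @ M @ Q)       # Rayleigh-quotient eigenvalue estimates
    return Q, eigvals

def update_embedding(M_new, Q_old, n_iter=50):
    """Update the eigenvectors when the embedding matrix grows from n x n to
    (n+1) x (n+1): pad the old eigenvectors with a zero row for the new
    sample and use them as the warm start for the enlarged eigen-problem."""
    d = Q_old.shape[1]
    Q0 = np.vstack([Q_old, np.zeros((1, d))])
    Q0, _ = np.linalg.qr(Q0)
    return orthogonal_iteration(M_new, Q0, n_iter)

def gaussian_affinity(X):
    """Dense Gaussian affinity matrix, standing in for a spectral-embedding matrix."""
    sq_dists = np.square(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1))
    return np.exp(-sq_dists)

# Toy usage: a 2-D embedding of 50 points, then one new point arrives.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
_, Q = np.linalg.eigh(gaussian_affinity(X))
Q_batch = Q[:, -2:]                      # batch solution: top-2 eigenvectors
X_plus = np.vstack([X, rng.standard_normal(3)])
Q_inc, vals = update_embedding(gaussian_affinity(X_plus), Q_batch)
```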
