Applying inverse stereographic projection to manifold learning and clustering

In machine learning, a data set is often modeled as a point set sampled from a manifold. Measuring the proximity of such data with Euclidean norms degrades the performance of learning methods. Moreover, many algorithms that rely on a similarity measure, such as Laplacian Eigenmaps and spectral clustering, implicitly assume that the Euclidean k-nearest neighbors of a point coincide with its local neighborhood on the manifold. In this paper, we propose a new method that transforms data lying on an unknown manifold to an n-sphere via the inverse stereographic projection, a conformal map that preserves the angles, and hence the similarities, of the data in the original manifold. The computed similarities therefore reflect the actual similarities of the data in the original space. Experimental results on various clustering and manifold learning problems show the effectiveness of our method.
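As a minimal sketch of the core transformation described above, the following NumPy snippet implements the standard inverse stereographic projection from R^n to the unit n-sphere in R^(n+1) (projecting from the north pole). The function name and the downstream use of the spherical coordinates are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

def inverse_stereographic(X):
    """Map points in R^n to the unit n-sphere in R^(n+1) via the
    inverse stereographic projection from the north pole.

    For x in R^n with s = ||x||^2, the image is
        ( 2x / (s + 1),  (s - 1) / (s + 1) ),
    which lies on the unit sphere. This map is conformal, i.e.
    angle-preserving, which is what lets similarities computed on
    the sphere reflect similarities in the original space.
    """
    X = np.atleast_2d(X)
    s = np.sum(X**2, axis=1, keepdims=True)        # squared norms, shape (m, 1)
    return np.hstack([2.0 * X, s - 1.0]) / (s + 1.0)

# Example: the origin of R^2 maps to the south pole of S^2.
south_pole = inverse_stereographic(np.zeros((1, 2)))  # → [[0., 0., -1.]]
```

After this transformation, one could, for instance, feed geodesic (arc-cosine of dot product) distances on the sphere into a spectral clustering routine in place of Euclidean distances; that particular downstream choice is an assumption here.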
