Orthogonal Projection Analysis

In this paper, we propose a novel linear dimensionality reduction algorithm, called Orthogonal Projection Analysis (OPA), derived from a gradient field perspective. Our approach is based on two criteria. First, the linear map should preserve the metric of the ambient space, under the assumption that this metric is reliable. Second, the map should satisfy the well-known smoothness criterion, which is critical for clustering. Interestingly, the gradient field is a natural tool for connecting these two requirements. We formulate a continuous objective function based on gradient fields and discuss how to discretize it using tangent spaces. We also show the geometric meaning of our approach: it requires the gradient field to be as orthogonal as possible to the tangent spaces. Experimental results demonstrate the effectiveness of the proposed approach.
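The abstract does not spell out the discretized objective, so the following NumPy sketch only illustrates the geometric idea under explicit assumptions: tangent spaces are estimated by local PCA on k-nearest neighborhoods, an orthogonality constraint W^T W = I stands in for metric preservation, and a sum of tangent-space projectors is used as a hypothetical surrogate for the discretized gradient-field penalty. The function name `opa_sketch` and its parameters are illustrative, not the paper's algorithm.

```python
import numpy as np

def opa_sketch(X, n_components=2, n_neighbors=10, intrinsic_dim=None):
    """Hypothetical OPA-style linear map: X (n_samples, n_features) -> W (n_features, n_components)."""
    n, d = X.shape
    k = intrinsic_dim if intrinsic_dim is not None else n_components

    # Brute-force k-nearest neighbors from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    knn = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]  # drop the self-match

    M = np.zeros((d, d))
    for i in range(n):
        # Estimate the tangent space at x_i by local PCA on its centered neighborhood.
        Ni = X[knn[i]] - X[knn[i]].mean(axis=0)
        _, _, Vt = np.linalg.svd(Ni, full_matrices=False)
        T = Vt[:k].T                 # (d, k) orthonormal tangent-space basis
        M += T @ T.T                 # accumulate tangent-space projectors

    # For a linear map f(x) = W^T x, the gradient of each coordinate is the
    # corresponding column w of W, so w^T (T_i T_i^T) w is its tangential
    # energy at x_i.  Minimizing trace(W^T M W) subject to W^T W = I (the
    # assumed stand-in for metric preservation) keeps the gradient field as
    # orthogonal as possible to the tangent spaces.
    _, eigvecs = np.linalg.eigh(M)   # eigenvalues returned in ascending order
    return eigvecs[:, :n_components]
```

For instance, `W = opa_sketch(X, n_components=2)` yields a projection applied as `X @ W`; the eigenproblem here is a surrogate chosen for this sketch, and the paper's actual discretization may differ.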
