Diffusion map kernel analysis for target classification

Given a high-dimensional dataset, one would like to represent the data using fewer parameters while preserving relevant information. Traditionally this has been done with principal component analysis, factor analysis, or feature selection. However, if we assume the original data actually lies on a lower-dimensional manifold embedded in a high-dimensional feature space, then recently popularized approaches grounded in graph theory and differential geometry allow us to learn the underlying manifold that generates the data. One such manifold-learning technique, called Diffusion Maps, preserves the local proximity between data points by first constructing a representation of the underlying manifold. This work examines binary target classification problems using Diffusion Maps to embed the data with various kernel representations for the diffusion parameter. Results demonstrate that specific kernels are well suited to Diffusion Map applications on some sonar feature sets and, more generally, that certain kernels outperform the standard Gaussian and polynomial kernels on several of the higher-dimensional data sets, including the sonar data, in contrast with their performance on the lower-dimensional publicly available data sets.
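As a point of reference for the embedding the abstract describes, the following is a minimal sketch of the standard diffusion-map construction using the Gaussian kernel (one of the baseline kernels compared above). The function name and parameters (`epsilon`, `n_components`, `t`) are illustrative, not the paper's implementation; swapping in a different kernel matrix `K` is the substitution the paper studies.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    """Embed rows of X via diffusion maps with a Gaussian kernel (sketch)."""
    # Pairwise squared Euclidean distances between data points
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel; other kernels would replace this line
    K = np.exp(-sq / epsilon)
    # Degree vector and symmetric conjugate of the Markov matrix
    # P = D^{-1} K; eigendecompose A = D^{-1/2} K D^{-1/2} for stability
    d = K.sum(axis=1)
    A = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(A)
    # Sort eigenpairs by decreasing eigenvalue
    idx = np.argsort(vals)[::-1]
    vals, vecs = vals[idx], vecs[:, idx]
    # Recover right eigenvectors of P from those of A
    psi = vecs / np.sqrt(d)[:, None]
    # Diffusion coordinates at time t; the first (trivial) eigenvector is skipped
    return psi[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)
```

A classifier (e.g. nearest-neighbor) would then be trained on the low-dimensional diffusion coordinates rather than the raw features.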
