Performance Comparison of Nonlinear Dimensionality Reduction Methods for Image Data Using Different Distance Measures

In recent years, a special class of nonlinear dimensionality reduction (NLDR) methods known as manifold learning has attracted considerable attention for the low-dimensional representation of high-dimensional data. The most commonly used NLDR methods, such as Isomap, locally linear embedding, local tangent space alignment, Hessian locally linear embedding, Laplacian eigenmaps, and diffusion maps, are built on finding the neighborhood of every data point in the high-dimensional space. These algorithms typically use the Euclidean distance as the metric between two data points, although the literature offers a variety of (dis)similarity measures for comparing data points or images. In this paper the authors present a systematic comparative analysis of the performance of different NLDR algorithms in reducing high-dimensional image data to a low-dimensional 2D representation under different distance measures. The performance of an algorithm is measured by how successfully it preserves the intrinsic geometry of the high-dimensional manifold; visualization of the low-dimensional data then reveals the original structure of the high-dimensional data.
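The common first step shared by the methods named above is building a k-nearest-neighbor graph over the data, and the distance measure plugged into that step is exactly what the comparison varies. The sketch below (a minimal NumPy illustration, not the authors' implementation; the metric names and toy data are assumptions) shows how swapping the (dis)similarity measure changes which points count as neighbors:

```python
import numpy as np

def pairwise_distances(X, metric="euclidean"):
    """Distance matrix between rows of X (each row a flattened image)."""
    if metric == "euclidean":
        diff = X[:, None, :] - X[None, :, :]
        return np.sqrt((diff ** 2).sum(axis=-1))
    if metric == "manhattan":
        return np.abs(X[:, None, :] - X[None, :, :]).sum(axis=-1)
    if metric == "cosine":
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        return 1.0 - Xn @ Xn.T  # cosine dissimilarity
    raise ValueError(f"unknown metric: {metric}")

def knn_graph(X, k=5, metric="euclidean"):
    """Indices of the k nearest neighbors of each point (self excluded)."""
    D = pairwise_distances(X, metric)
    # column 0 of the sorted indices is the point itself (distance 0)
    return np.argsort(D, axis=1)[:, 1:k + 1]

# Toy stand-in for an image dataset: 20 "images" of 64 pixels each.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 64))
nbrs_euclidean = knn_graph(X, k=5, metric="euclidean")
nbrs_cosine = knn_graph(X, k=5, metric="cosine")
```

Methods such as Isomap, LLE, and Laplacian eigenmaps would then operate on this neighbor graph, so any change in the neighborhood sets induced by the chosen measure propagates into the final embedding.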