Local tangent space alignment based on Hilbert–Schmidt independence criterion regularization

Local tangent space alignment (LTSA) is a well-known manifold learning algorithm, and many other manifold learning algorithms have been developed on its basis. From the viewpoint of dimensionality reduction, however, LTSA preserves only local features, whereas the dimensionality reduction community is now pursuing algorithms capable of preserving local and global features at the same time. In this paper, a new dimensionality reduction algorithm, called HSIC-regularized LTSA (HSIC–LTSA), is proposed, in which a Hilbert–Schmidt independence criterion (HSIC) regularization term is added to the objective function of LTSA. HSIC has been used in many machine learning applications, but so far it has neither been applied directly to dimensionality reduction nor been used as a regularization term combined with other machine learning algorithms. The proposed HSIC–LTSA is therefore a new attempt for both HSIC and LTSA. In HSIC–LTSA, HSIC makes the high- and low-dimensional data as statistically dependent as possible, while LTSA reduces the data dimension under a local homeomorphism-preserving criterion. Experimental results on several commonly used datasets show that HSIC–LTSA performs better than LTSA as well as some state-of-the-art local- and global-preserving algorithms.
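
To make the regularization term concrete, the following minimal Python sketch computes the standard biased empirical HSIC estimator of Gretton et al., HSIC(X, Y) = tr(K H L H) / (n - 1)^2, which HSIC–LTSA adds to the LTSA objective. This is an illustration, not the authors' implementation: the choice of Gaussian kernels, the bandwidths sigma_x and sigma_y, and the trade-off weight lam mentioned in the comments are assumptions made for the example.

```python
# Minimal sketch (not the authors' code) of the biased empirical HSIC
# estimator from Gretton et al. (2005), used here as the regularizer
# that HSIC-LTSA adds to the LTSA objective. Kernel choices and
# bandwidths below are illustrative assumptions.
import numpy as np

def gaussian_kernel(Z, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix for the rows of Z."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T   # squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    K = gaussian_kernel(X, sigma_x)       # kernel on high-dimensional data
    L = gaussian_kernel(Y, sigma_y)       # kernel on low-dimensional embedding
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Conceptually, HSIC-LTSA seeks an embedding Y that minimizes the LTSA
# alignment error while keeping HSIC(X, Y) large, i.e. an objective of
# the form  min_Y  tr(Y^T B Y) - lam * HSIC(X, Y)  with  Y^T Y = I,
# where B is the LTSA alignment matrix and lam > 0 is a hypothetical
# regularization weight.
```

One convenient construction, worth noting though not confirmed by the abstract, is to use a linear kernel L = Y Yᵀ on the embedding: the regularizer then becomes tr(Yᵀ H K H Y), so the combined objective min tr(Yᵀ (B − λ H K H) Y) subject to Yᵀ Y = I reduces to a standard eigenvalue problem, just like plain LTSA.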
