An Explicit Sparse Mapping for Nonlinear Dimensionality Reduction

A disadvantage of most nonlinear dimensionality reduction methods is that they provide no explicit mapping for projecting high-dimensional features into the low-dimensional representation space. Several methods have previously been proposed to equip nonlinear dimensionality reduction with explicit mappings. However, the mapping functions learned by these methods are combinations of all the original features, which makes the results difficult to interpret. In addition, their dense projection matrices incur high storage and computation costs. In this paper, we present a framework based on L1-norm regularization for learning explicit sparse polynomial mappings for nonlinear dimensionality reduction. By applying this framework to locally linear embedding, we derive an explicit sparse nonlinear dimensionality reduction algorithm, which we name sparse neighborhood preserving polynomial embedding. Experimental results on real-world classification and clustering problems demonstrate the effectiveness of our approach.
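
The general recipe behind such an approach can be sketched in a few lines. The following minimal Python/scikit-learn sketch is an illustration of the idea, not the paper's exact algorithm: it computes a locally linear embedding of the training data, then learns an explicit sparse polynomial mapping by fitting an L1-regularized (lasso) regression from polynomial features of the inputs onto the embedding coordinates, so that unseen points can be projected directly. The dataset, polynomial degree, and regularization strength are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of an explicit sparse polynomial mapping for nonlinear
# dimensionality reduction (illustrative only, not the paper's algorithm):
#   1) obtain low-dimensional coordinates with locally linear embedding (LLE);
#   2) regress those coordinates on polynomial features of the inputs with
#      L1 (lasso) regularization, giving a sparse, explicit mapping that can
#      project out-of-sample points.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Lasso

# Toy high-dimensional data: a 3-D swiss roll (stand-in for real features).
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Step 1: low-dimensional coordinates from LLE (no explicit mapping yet).
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y = lle.fit_transform(X)                        # shape (n_samples, 2)

# Step 2: explicit sparse polynomial mapping via L1-regularized regression.
poly = PolynomialFeatures(degree=2, include_bias=False)
scaler = StandardScaler()                       # standardize for conditioning
Phi = scaler.fit_transform(poly.fit_transform(X))
lasso = Lasso(alpha=0.01, max_iter=50000)
lasso.fit(Phi, Y)                               # one sparse model per embedding dim

# The mapping is explicit and sparse: many coefficients are exactly zero,
# so only a few polynomial terms are needed to project a new point.
print("nonzero coefficients:", np.count_nonzero(lasso.coef_), "of", lasso.coef_.size)

# Out-of-sample projection of unseen points with the learned mapping.
X_new, _ = make_swiss_roll(n_samples=5, random_state=1)
Y_new = lasso.predict(scaler.transform(poly.transform(X_new)))  # shape (5, 2)
```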
