On the Theoretical and Computational Analysis between SDA and Lap-LDA

Semi-supervised dimensionality reduction is an important research topic in many pattern recognition and machine learning applications. Among the existing methods, semi-supervised discriminant analysis (SDA) and Laplacian-regularized linear discriminant analysis (Lap-LDA) are two popular ones. Both reduce dimensionality by preserving the discriminative structure embedded in the labeled samples as well as the manifold structure embedded in both labeled and unlabeled samples, but they follow different schemes: SDA adds the manifold regularization term to the objective function of LDA, whereas Lap-LDA adds the same term to a least-squares objective with a specific class-indicator matrix. In this paper, we further analyze the two schemes and establish their equivalence under a certain condition, and we characterize how the methods differ when that condition is not satisfied. Extensive simulations conducted on several datasets confirm the theoretical analysis. Finally, motivated by the equivalence and the differences between the two methods, we propose an improved approach for semi-supervised dimensionality reduction. The proposed approach is a two-stage approach that obtains solutions equivalent to those of Lap-LDA (in the first stage) and SDA (in the second stage) at a lower computational cost.
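
For concreteness, the two objectives being compared can be sketched as follows, assuming the standard formulations of SDA and Lap-LDA (the data matrix $X$, between-class and total scatter matrices $S_b$ and $S_t$, graph Laplacian $L$ built over labeled and unlabeled samples, class-indicator matrix $Y$, and regularization weights $\alpha$, $\beta$ are notation assumed for this sketch rather than taken from the abstract):

\[ \text{SDA:}\quad \max_{w}\; \frac{w^{\top} S_b\, w}{\,w^{\top}\!\left( S_t + \alpha\, X L X^{\top} \right) w\,} \]

\[ \text{Lap-LDA:}\quad \min_{W}\; \bigl\lVert X^{\top} W - Y \bigr\rVert_F^{2} \;+\; \alpha\, \operatorname{tr}\!\left( W^{\top} X L X^{\top} W \right) \;+\; \beta\, \lVert W \rVert_F^{2} \]

Both formulations share the manifold regularization term $\alpha\, X L X^{\top}$; they differ in whether it augments the LDA scatter ratio or the least-squares fit to the class indicator.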
