Nonlinear Discriminant Analysis on Embedded Manifold

Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, mainly focus on uncovering the latent low-dimensional geometric structure of the training samples in an unsupervised manner, ignoring useful class information. The derived low-dimensional representations are therefore not necessarily optimal in discriminative capability. In this paper, we study the discriminant analysis problem while taking the nonlinear manifold structure of the data space into account. To this end, a new clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), is first proposed to partition the samples into multiple clusters while ensuring that each cluster contains a balanced number of samples from every class; each cluster can then be regarded, approximately, as a local patch on the embedded manifold. The local discriminative projections for the different clusters are then calculated simultaneously by optimizing the global Fisher criterion based on the cluster-weighted data representation. Compared with traditional linear/kernel discriminant analysis (KDA) algorithms, the proposed algorithm has the following characteristics: 1) it is essentially a KDA algorithm with a geometry-adaptive kernel tailored to the specific data structure, in contrast to traditional KDA, in which the kernel is fixed and independent of the data set; 2) it is approximately a locally linear but globally nonlinear discriminant analyzer; 3) it does not need to store the original samples in order to compute the low-dimensional representation of a new sample; and 4) it is computationally efficient compared with traditional KDA when the number of samples is large. A toy problem on artificial data demonstrates the effectiveness of the proposed algorithm in deriving discriminative representations for problems with nonlinear decision boundaries. Face recognition experiments on the YALE and CMU PIE databases show that the proposed algorithm significantly outperforms both linear discriminant analysis (LDA) and Mixture LDA, and achieves higher accuracy than KDA with traditional kernels.
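To make the two-stage idea concrete, below is a minimal NumPy sketch. It is not the authors' implementation: `balanced_kmeans` is a simplified stand-in for ICBKM (per-class nearest-centroid assignment merely encourages class balance within clusters rather than enforcing it), and the local projections are fit independently per cluster with a hard nearest-cluster assignment, in place of the paper's jointly optimized, cluster-weighted global Fisher criterion. All function names are hypothetical.

```python
import numpy as np

def balanced_kmeans(X, y, n_clusters, n_iter=20, seed=0):
    """Simplified stand-in for ICBKM: each class's samples are assigned to
    their nearest centroid independently, which encourages (but, unlike the
    real ICBKM, does not guarantee) every cluster to contain every class."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_clusters, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        for c in np.unique(y):                        # per-class assignment
            idx = np.flatnonzero(y == c)
            dists = ((X[idx, None, :] - centers[None, :, :]) ** 2).sum(-1)
            labels[idx] = dists.argmin(axis=1)
        for k in range(n_clusters):                   # centroid update
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return labels, centers

def fisher_projection(X, y, dim):
    """Classical Fisher/LDA projection: leading eigenvectors of pinv(Sw) @ Sb."""
    mu, d = X.mean(axis=0), X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                 # within-class scatter
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)    # between-class scatter
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:dim]]

def fit_local_projections(X, y, n_clusters, dim=1):
    """Stage 2 (simplified): one Fisher projection per non-empty cluster.
    The paper instead couples all local projections through a single global
    Fisher criterion on a soft, cluster-weighted representation."""
    labels, centers = balanced_kmeans(X, y, n_clusters)
    keep = [k for k in range(n_clusters) if np.any(labels == k)]
    projs = [fisher_projection(X[labels == k], y[labels == k], dim) for k in keep]
    return centers[keep], projs

def transform(x, centers, projs):
    """Locally linear, globally nonlinear map: project a new sample with the
    Fisher basis of its nearest cluster. Only centroids and projections are
    stored, not the training samples, echoing property 3) in the abstract."""
    k = ((centers - x) ** 2).sum(axis=1).argmin()
    return x @ projs[k]

# Usage: two classes on concentric arcs, not linearly separable globally
# but separable within each local patch.
rng = np.random.default_rng(1)
t = rng.uniform(0, np.pi, 200)
X = np.vstack([np.c_[np.cos(t), np.sin(t)],
               np.c_[1.6 * np.cos(t), 1.6 * np.sin(t)]])
y = np.r_[np.zeros(200), np.ones(200)]
centers, projs = fit_local_projections(X, y, n_clusters=4, dim=1)
z = transform(X[0], centers, projs)   # 1-D discriminative coordinate
```

Fitting the per-cluster projections independently keeps the sketch short; the key design point it preserves is that the overall map is piecewise linear over the clustered patches, hence nonlinear globally.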
