Parallel Nonlinear Discriminant Feature Extraction for Face and Handwritten Digit Recognition

For recognition tasks with large amounts of data, nonlinear discriminant feature extraction often incurs a heavy computational burden. Although several acceleration methods have been proposed, substantially reducing the computing time while preserving a favorable recognition result remains challenging. In this paper, we introduce parallel computing into nonlinear subspace learning and build a parallel nonlinear discriminant feature extraction framework. We first design a random, non-overlapping, equal-size data division strategy that splits the whole training sample set into several subsets and assigns each computational node one subset. We then learn nonlinear discriminant subspaces from these subsets separately, without mutual communication, and finally select the most appropriate subspace for classification. Under this framework, we propose a novel nonlinear subspace learning approach, parallel nonlinear discriminant analysis (PNDA). Experimental results on three public face and handwritten digit image databases demonstrate the efficiency and effectiveness of the proposed approach.
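The three steps of the framework (random equal division, independent subspace learning, subspace selection) can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' PNDA implementation: kernel discriminant analysis is stood in for by a Nystroem RBF feature map followed by LDA from scikit-learn, joblib supplies the per-node parallelism, and the subset count, kernel parameters, dataset, and selection rule are all illustrative assumptions.

```python
# A minimal sketch of the parallel framework described above, not the authors'
# PNDA implementation. Kernel discriminant analysis is approximated by a
# Nystroem RBF feature map followed by LDA; the dataset, subset count, kernel
# parameters, and selection rule are illustrative assumptions.
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.kernel_approximation import Nystroem
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline


def random_equal_split(n_samples, n_subsets, rng):
    """Randomly divide sample indices into non-overlapping, near-equal subsets."""
    return np.array_split(rng.permutation(n_samples), n_subsets)


def fit_subspace(X_sub, y_sub):
    """Learn one nonlinear discriminant subspace from a single training subset."""
    model = make_pipeline(
        Nystroem(kernel="rbf", n_components=200, random_state=0),
        LinearDiscriminantAnalysis(),
    )
    return model.fit(X_sub, y_sub)


if __name__ == "__main__":
    X, y = load_digits(return_X_y=True)
    X_trainval, X_test, y_trainval, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(
        X_trainval, y_trainval, test_size=0.2, stratify=y_trainval, random_state=0)

    # Step 1: random non-overlapping equal data division, one subset per node.
    n_subsets = 4
    subsets = random_equal_split(len(X_train), n_subsets, np.random.default_rng(0))

    # Step 2: learn the subspaces independently, with no communication between nodes.
    models = Parallel(n_jobs=n_subsets)(
        delayed(fit_subspace)(X_train[idx], y_train[idx]) for idx in subsets)

    # Step 3: select the most appropriate subspace (here: best validation accuracy).
    val_scores = [m.score(X_val, y_val) for m in models]
    best = models[int(np.argmax(val_scores))]
    print("validation accuracies:", np.round(val_scores, 3))
    print("test accuracy of selected subspace:", round(best.score(X_test, y_test), 3))
```

Because each node works only on its own subset and the nodes never exchange data, the per-node kernel computations stay small; this locality is the source of the speed-up over learning a single nonlinear discriminant subspace on the full training set.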
