A Discriminant Analysis for Undersampled Data

One of the inherent problems in pattern recognition is the undersampled data problem, closely related to the curse of dimensionality. In this paper a new algorithm called pairwise discriminant analysis (PDA) is proposed for pattern recognition. Like linear discriminant analysis (LDA), PDA performs dimensionality reduction and clustering, but it is far less susceptible to the undersampled data problem than LDA.
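To make the undersampled data problem concrete, the sketch below (not part of the paper, and not the proposed PDA algorithm) uses synthetic Gaussian data to show why classical LDA breaks down when the dimensionality exceeds the number of training samples: the within-class scatter matrix becomes rank-deficient, so the inverse required by the standard LDA criterion does not exist. Class sizes, dimensions, and variable names are illustrative assumptions.

```python
import numpy as np

# Illustration of the undersampled (small sample size) problem in classical LDA:
# with fewer samples than dimensions, the within-class scatter matrix S_w is
# singular, so the usual solution involving inv(S_w) cannot be computed.
rng = np.random.default_rng(0)

d, n_per_class = 100, 10                          # dimensionality >> samples per class
X1 = rng.normal(size=(n_per_class, d))            # class 1 samples (assumed Gaussian)
X2 = rng.normal(loc=1.0, size=(n_per_class, d))   # class 2 samples, shifted mean

def within_class_scatter(*classes):
    """Sum of centered outer products over all classes."""
    d = classes[0].shape[1]
    S_w = np.zeros((d, d))
    for X in classes:
        Xc = X - X.mean(axis=0)
        S_w += Xc.T @ Xc
    return S_w

S_w = within_class_scatter(X1, X2)
rank = np.linalg.matrix_rank(S_w)
print(f"dimension d = {d}, rank(S_w) = {rank}")
# rank(S_w) <= (n1 - 1) + (n2 - 1) = 18 < d = 100, so S_w is singular and the
# classical LDA eigenproblem is ill-posed; methods designed for undersampled
# data avoid or regularize this inversion.
```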
