Pattern Classification Using Composite Features

In this paper, we propose a new classification method based on composite features, each of which consists of a number of primitive features. The covariance of two composite features captures the statistical dependency among multiple primitive features. A new discriminant analysis (C-LDA) based on the covariance of composite features generalizes linear discriminant analysis (LDA). Unlike LDA, C-LDA can extract more features than the number of classes. Experimental results on several data sets indicate that C-LDA provides better classification results than other methods.
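For context, the rank limit that C-LDA removes can be illustrated with a minimal NumPy sketch of classical LDA (this is standard LDA, not the authors' C-LDA): the between-class scatter matrix has rank at most C-1 for C classes, so at most C-1 discriminant directions carry information.

```python
import numpy as np

def lda_directions(X, y):
    """Classical LDA: solve the generalized eigenproblem S_b w = lam S_w w.
    rank(S_b) <= C-1, so at most C-1 eigenvalues are nonzero -- the
    feature-count limit that C-LDA is designed to overcome."""
    classes = np.unique(y)
    n, d = X.shape
    mean = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Small ridge on S_w for numerical stability.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evals.real[order], evecs.real[:, order]

# Synthetic example: 3 classes in 5 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 5)) + np.repeat(np.arange(3), 30)[:, None]
y = np.repeat(np.arange(3), 30)
evals, W = lda_directions(X, y)
# With C = 3 classes, only the first C-1 = 2 eigenvalues are
# (numerically) nonzero, regardless of the input dimension d = 5.
```

C-LDA sidesteps this limit by building its scatter matrices from covariances of composite features (groups of primitive features) rather than from single primitive features, so the number of extracted features is no longer bounded by the number of classes.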
