On an Extended Fisher Criterion for Feature Selection

This correspondence treats feature extraction as a linear transformation of the initial pattern space into a new space that is optimal with respect to discriminating the data. A solution of the feature extraction problem is given for two multivariate normally distributed pattern classes, using an extended Fisher criterion as the distance measure. The proposed distance measure consists of two terms: the first estimates the distance between the classes from the difference of their mean vectors, and the second from the difference of their class covariance matrices. The method is compared with two popular alternatives, the Fukunaga-Koontz method and the Foley-Sammon method.
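
As a rough illustration of the kind of two-term criterion described above, the sketch below scores a candidate linear transformation of two Gaussian classes by combining a mean-difference term with a covariance-difference term. The specific functional form (a Mahalanobis-like distance between the transformed means plus a Frobenius norm of the difference of the transformed covariances), as well as the function and variable names, are assumptions made for illustration; the paper's exact extended Fisher criterion is not reproduced here.

```python
import numpy as np


def extended_fisher_score(W, mu1, mu2, S1, S2):
    """Illustrative two-term separability score for a linear map y = W.T @ x.

    Assumed form (not the paper's exact criterion):
      term 1 -- distance between the transformed class means,
                normalized by the pooled transformed covariance;
      term 2 -- distance between the transformed class covariances.
    """
    # Transform the class statistics into the reduced feature space.
    m1, m2 = W.T @ mu1, W.T @ mu2
    C1, C2 = W.T @ S1 @ W, W.T @ S2 @ W

    # Term based on the difference of the mean vectors (Fisher-like).
    pooled = 0.5 * (C1 + C2)
    dm = m1 - m2
    mean_term = float(dm @ np.linalg.solve(pooled, dm))

    # Term based on the difference of the covariance matrices.
    cov_term = float(np.linalg.norm(C1 - C2, ord="fro") ** 2)

    return mean_term + cov_term


# Toy usage: two 4-D Gaussian classes projected onto 2 features.
rng = np.random.default_rng(0)
mu1, mu2 = np.zeros(4), np.array([1.0, 0.5, 0.0, 0.0])
A = rng.standard_normal((4, 4))
S1 = np.eye(4)
S2 = A @ A.T / 4 + np.eye(4)      # deliberately different covariance structure
W = rng.standard_normal((4, 2))   # candidate transformation (columns = new features)
print(extended_fisher_score(W, mu1, mu2, S1, S2))
```

In this sketch, classes that differ only in covariance (equal means) still receive a nonzero score through the second term, which is the motivation the abstract gives for extending the classical Fisher criterion.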

[1] R. F. et al., Mathematical Statistics, Nature, 1944.

[2] K.-S. Fu et al., "Selection and Ordering of Feature Observations in a Pattern Recognition System," Information and Control, 1968.

[3] J. Kittler, "On the Discriminant Vector Method of Feature Selection," IEEE Transactions on Computers, 1977.

[4] R. O. Duda et al., Pattern Classification and Scene Analysis, Wiley-Interscience, 1974.

[5] K. Fukunaga et al., "Application of the Karhunen-Loève Expansion to Feature Selection and Ordering," IEEE Transactions on Computers, 1970.

[6] J. W. Sammon et al., "An Optimal Set of Discriminant Vectors," IEEE Transactions on Computers, 1975.

[7] P. Lancaster, The Theory of Matrices, 1969.

[8] J. W. Sammon, "An Optimal Discriminant Plane," IEEE Transactions on Computers, 1970.

[9] D. Kazakos, "Maximin Linear Discrimination, I," IEEE Transactions on Systems, Man, and Cybernetics, 1977.

[10] K. Fukunaga et al., "A Criterion and an Algorithm for Grouping Data," IEEE Transactions on Computers, 1970.

[11] H. P. Friedman et al., "On Some Invariant Criteria for Grouping Data," 1967.