Standard unsupervised feature extraction methods such as PCA and ICA provide representative features and latent variables that minimize the data reconstruction error. These generative features may be common to all data and may not be optimal for classification tasks. Discriminant ICA (dICA) and discriminant NMF (dNMF) have recently been proposed; they jointly maximize the Fisher linear discriminant and the negentropy of the extracted features. Motivated by independence among features and a modified Fisher linear discriminant, the new algorithm extracts features with both generative and discriminant power. The features are then further fine-tuned by supervised learning. Experimental results show excellent recognition performance with these features.
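As a rough illustration of the discriminative term in such objectives, the sketch below computes a per-feature Fisher criterion (between-class scatter over within-class scatter). This is a minimal assumption-laden example, not the actual dICA/dNMF objective, whose exact form (including the negentropy term and the modification to the Fisher discriminant) is not given here; the function name `fisher_scores` is hypothetical.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher criterion: between-class scatter / within-class scatter.

    X: (n_samples, n_features) feature matrix
    y: (n_samples,) integer class labels
    Returns a score per feature; larger means more class-discriminative.
    """
    classes = np.unique(y)
    mu = X.mean(axis=0)                       # overall mean per feature
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        nc = Xc.shape[0]
        between += nc * (Xc.mean(axis=0) - mu) ** 2   # between-class scatter
        within += nc * Xc.var(axis=0)                 # within-class scatter
    return between / within

# Feature 0 separates the two classes; feature 1 does not.
X = np.array([[0.0, 1.0], [0.1, 0.0], [5.0, 1.0], [5.1, 0.0]])
y = np.array([0, 0, 1, 1])
scores = fisher_scores(X, y)
```

A method like dICA/dNMF would optimize a criterion of this kind jointly with a generative term, rather than scoring fixed features after the fact.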