Supervised Linear Dimensionality Reduction Based on Factor Analysis and Information-Theoretic Criteria

Recently, supervised linear dimensionality reduction methods based on information-theoretic criteria have attracted increasing attention. In existing methods, the first- and second-order statistics of the data are estimated in the original high-dimensional space, and the transformation matrix is then designed according to the information-theoretic criterion. However, it is difficult to estimate these statistics accurately in the original high-dimensional space, especially when the available data are limited. The resulting transformation matrix may therefore be suboptimal, which degrades the final classification performance. To address this problem, a novel supervised linear dimensionality reduction method is proposed in this paper. In our method, the statistical structure of the transformed low-dimensional subspace is described by a factor analysis (FA) model, while the mutual information (MI) between the transformed data and their class labels is maximized. Moreover, the MI function and the log-likelihood function are optimized jointly, which not only reduces the estimation errors but also ensures the separability of the transformed data. Experimental results on benchmark datasets and radar data demonstrate the effectiveness of our method.
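To make the general idea concrete, the following is a minimal sketch of supervised linear dimensionality reduction by MI maximization. It is not the paper's FA-based formulation: it swaps in scikit-learn's generic k-nearest-neighbor MI estimator (`mutual_info_classif`) as the objective and a derivative-free optimizer over the projection matrix, and the dataset, dimensions, and optimizer settings are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

# Toy data: project 4-D iris features down to k = 2 dimensions.
X, y = load_iris(return_X_y=True)
d, k = X.shape[1], 2
rng = np.random.default_rng(0)

def neg_mi(w_flat):
    """Negative MI between projected data and labels (to be minimized)."""
    W = w_flat.reshape(d, k)
    Z = X @ W  # transformed low-dimensional data
    # Sum of per-coordinate MI estimates, a common tractable surrogate
    # for the joint MI used in information-theoretic criteria.
    return -mutual_info_classif(Z, y, random_state=0).sum()

# Derivative-free search over the entries of W (the MI estimate is
# not differentiable in W, so Nelder-Mead is a simple stand-in).
res = minimize(neg_mi, rng.standard_normal(d * k), method="Nelder-Mead",
               options={"maxiter": 200})
W = res.x.reshape(d, k)
print("estimated MI after optimization:", -res.fun)
```

The paper's contribution, by contrast, is to model the low-dimensional subspace itself with an FA model and optimize the MI and log-likelihood jointly, avoiding the fragile statistics estimation in the original high-dimensional space that this sketch still relies on.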
