Unsupervised fuzzy multivariate symmetric uncertainty feature selection based on constructing virtual cluster representative

Abstract Improving data readability, reducing the complexity of learning algorithms, and enhancing predictive performance are the main motivations for feature selection, especially when many features are present. In recent years, unsupervised feature selection techniques have been extensively explored. Among the various feature selection approaches, algorithms based on information theory are effective at selecting useful attributes and eliminating useless ones. Methods based on mutual information and symmetric uncertainty can effectively estimate feature relevancy, but their bivariate measures ignore possible dependencies among more than two features. To address this limitation, this research introduces a novel unsupervised feature selection method, called Fuzzy Multivariate Symmetric Uncertainty-Feature Selection (FMSU-FS). The proposed method also avoids the discretization of continuous features, which causes information loss. To evaluate its effectiveness, FMSU-FS is compared with conventional information-theoretic methods. Experimental results on benchmark datasets show that our approach improves clustering performance.
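For context, the bivariate symmetric uncertainty that this work generalizes is defined as SU(X, Y) = 2 I(X; Y) / (H(X) + H(Y)), a normalized mutual information in [0, 1]. The sketch below is an illustrative implementation for discrete features only, not the paper's fuzzy multivariate method; the function names are our own.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (in bits) of a discrete sequence."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def symmetric_uncertainty(x, y):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), normalized to [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy H(X, Y)
    mi = hx + hy - hxy               # mutual information I(X; Y)
    denom = hx + hy
    return 2 * mi / denom if denom > 0 else 0.0

# A feature compared with itself gives SU = 1; an independent feature gives SU = 0.
x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
print(symmetric_uncertainty(x, x))  # 1.0
print(symmetric_uncertainty(x, y))  # 0.0
```

As the abstract notes, pairwise scores like this cannot capture joint dependencies among three or more features, which is the gap the proposed multivariate formulation targets.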
