A Feature Selection Algorithm Based on Equal Interval Division and Minimal-Redundancy–Maximal-Relevance

The minimal-redundancy–maximal-relevance (mRMR) algorithm is a classical feature selection algorithm. To select the feature that has minimal redundancy with the already-selected features and maximal relevance with the class label, the mRMR objective function subtracts the average mutual information between the candidate feature and the selected features from the mutual information between the candidate feature and the class label, and selects the feature with the maximum difference. However, the feature with the maximum difference is not always the feature with minimal redundancy and maximal relevance. To address this problem, the mRMR objective function is first analyzed, and a constraint condition is derived that determines whether the objective function can guarantee the effectiveness of the selected features. Then, for the cases where the objective function is not accurate, an equal-interval-division strategy is proposed and combined with ranking to process the interval of mutual information between features and the class label, and the interval of the average mutual information between features. Finally, a feature selection algorithm based on equal interval division and minimal-redundancy–maximal-relevance (EID–mRMR) is proposed. To validate the performance of EID–mRMR, we compare it with several incremental feature selection algorithms based on mutual information, as well as other feature selection algorithms. Experimental results demonstrate that EID–mRMR achieves better feature selection performance.
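The greedy mRMR criterion the abstract describes can be sketched as follows. This is a minimal illustrative implementation, not the paper's EID–mRMR method: it assumes discrete (pre-discretized) features, uses plain plug-in estimates of mutual information, and the function names are our own.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(x; y) in nats for two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))   # joint probability estimate
            if p_xy > 0:
                p_x = np.mean(x == xv)
                p_y = np.mean(y == yv)
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

def mrmr_select(X, y, k):
    """Greedy mRMR: at each step pick the candidate feature f maximizing
    I(f; y) - mean_{s in S} I(f; s), where S is the selected set."""
    n_features = X.shape[1]
    relevance = [mutual_information(X[:, j], y) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]  # start with the most relevant feature
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([mutual_information(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy   # the mRMR difference criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

On a toy dataset where column 1 duplicates column 0, the criterion passes over the duplicate (high redundancy) in favor of a less redundant feature, which is exactly the behavior whose failure cases the paper analyzes: the maximum of this difference need not coincide with the truly minimal-redundancy, maximal-relevance feature.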
