Dynamic ensemble selection based on hesitant fuzzy multiple criteria decision making

Among the many extensions of fuzzy sets, hesitant fuzzy sets (HFSs) are both theoretically interesting and practical. This paper proposes an application of HFSs to multiple classifier systems (MCSs). MCSs have proven to be an effective and robust strategy for classification problems. They combine several classifiers and are generally built in three steps: generation, selection (optional), and integration. This paper focuses on the selection step and proposes a novel dynamic ensemble selection method. In particular, the proposed method employs several selection criteria to assess the competence of the base classifiers, and a hesitant fuzzy multiple criteria decision making (HMCDM) method is then used to select the most appropriate classifiers. Experimental results show that the proposed framework improves classification accuracy compared with current state-of-the-art dynamic ensemble selection techniques, and the Quade nonparametric statistical test confirms the effectiveness of the proposed method.
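
As a rough illustration of the selection step described above, the Python sketch below implements a generic dynamic ensemble selection loop: for each query sample, the competence of every base classifier is estimated over its region of competence according to several criteria, the resulting values are treated as a hesitant fuzzy element and collapsed with a simple mean-based score function, and the top-scoring classifiers vote on the final label. The particular criteria, the score function, and all function names here are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def select_ensemble(pool, X_val, y_val, x_query, k=7, n_select=3):
    """Pick the n_select most competent classifiers for a single query point."""
    # 1. Region of competence: the k validation samples nearest to the query.
    nn = NearestNeighbors(n_neighbors=k).fit(X_val)
    idx = nn.kneighbors(x_query.reshape(1, -1), return_distance=False)[0]
    X_loc, y_loc = X_val[idx], y_val[idx]

    scores = []
    for clf in pool:
        y_pred = clf.predict(X_loc)
        proba = clf.predict_proba(X_loc)
        sorted_proba = np.sort(proba, axis=1)
        # 2. Several selection criteria form one hesitant fuzzy element per classifier
        #    (local accuracy, mean confidence in the true class, mean decision margin);
        #    these criteria are illustrative, not the paper's exact ones.
        local_acc = float(np.mean(y_pred == y_loc))
        class_index = {c: i for i, c in enumerate(clf.classes_)}
        true_conf = float(np.mean([proba[i, class_index[y]] for i, y in enumerate(y_loc)]))
        margin = float(np.mean(sorted_proba[:, -1] - sorted_proba[:, -2]))
        hfe = [local_acc, true_conf, margin]
        # 3. Collapse the hesitant fuzzy element with a simple mean score function.
        scores.append(float(np.mean(hfe)))

    # 4. Keep the classifiers with the highest hesitant fuzzy scores.
    best = np.argsort(scores)[::-1][:n_select]
    return [pool[i] for i in best]

def predict_query(pool, X_val, y_val, x_query):
    """Majority vote over the dynamically selected sub-ensemble."""
    selected = select_ensemble(pool, X_val, y_val, x_query)
    votes = [clf.predict(x_query.reshape(1, -1))[0] for clf in selected]
    return max(set(votes), key=votes.count)

In this sketch the pool is any list of fitted scikit-learn classifiers (for example, a bagging pool of decision trees), and X_val, y_val is a held-out validation set used only to estimate local competence.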
