Evaluating the Dynamicity of Feature and Individual Classifiers Selection in Ensembles of Classifiers

A feature selection method aims to select the feature subset that best represents the entire dataset. Most of these methods apply a static selection procedure: they select a single feature subset and use it throughout the classification process. Recently, dynamic feature selection has emerged as an efficient alternative. Instead of selecting one feature subset for the entire dataset, a dynamic method selects the best feature subset for an individual instance or a group of instances, so that each instance or group has its own feature subset. Feature selection methods help to improve accuracy in classification tasks, whether single classifiers or ensembles of classifiers are used. In the context of ensembles, a feature selection method selects the best feature subset for each individual classifier of the ensemble. In this paper, we investigate the integration of dynamic feature selection (DFS) into ensembles of classifiers, more specifically the use of DFS methods within dynamic ensemble selection methods, in which an ensemble structure (a set of individual classifiers) is selected for each test instance. Our main objective is to promote dynamicity in ensembles of classifiers in order to obtain more robust ensembles. To carry out this investigation, two well-known dynamic ensemble selection methods are chosen for analysis. Our findings indicate, in the majority of cases, real benefits from integrating the dynamic feature selection method with the dynamic ensemble selection methods.
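To make the two forms of dynamicity concrete, the sketch below illustrates the general idea rather than the specific methods studied in the paper: the dataset, pool size, neighbourhood size, random-subspace feature subsets and KNORA-E-style competence rule are illustrative assumptions. Each classifier in the pool is trained on its own feature subset, and for every test instance only the pool members judged competent on that instance's neighbourhood (taken from a validation/DSEL set) are combined.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact algorithms):
# a pool of classifiers trained on distinct random feature subsets, combined
# with a KNORA-E-style dynamic selection of competent classifiers per instance.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.5, random_state=42)
X_dsel, X_test, y_dsel, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=42)

# Build the pool: each classifier sees a different random feature subset,
# so selecting classifiers per instance also selects feature subsets per instance.
n_classifiers, subset_size = 10, X.shape[1] // 2
subsets = [rng.choice(X.shape[1], subset_size, replace=False) for _ in range(n_classifiers)]
pool = [DecisionTreeClassifier(max_depth=5, random_state=i).fit(X_train[:, s], y_train)
        for i, s in enumerate(subsets)]

# Region of competence: k nearest neighbours of the query in the DSEL set.
knn = NearestNeighbors(n_neighbors=7).fit(X_dsel)

def predict_one(x):
    _, idx = knn.kneighbors(x.reshape(1, -1))
    neigh_X, neigh_y = X_dsel[idx[0]], y_dsel[idx[0]]
    # Keep classifiers that classify every neighbour correctly (KNORA-E idea);
    # if none qualify, fall back to the whole pool.
    selected = [(clf, s) for clf, s in zip(pool, subsets)
                if np.all(clf.predict(neigh_X[:, s]) == neigh_y)]
    if not selected:
        selected = list(zip(pool, subsets))
    # Majority vote among the selected classifiers, each using its own features.
    votes = [clf.predict(x[s].reshape(1, -1))[0] for clf, s in selected]
    return np.bincount(votes).argmax()

y_pred = np.array([predict_one(x) for x in X_test])
print("accuracy:", (y_pred == y_test).mean())
```

In this sketch the per-instance feature subsets are fixed at training time through the random-subspace pool; a DFS method of the kind investigated in the paper would instead choose the subset dynamically for each test instance, which is the integration the study evaluates.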
