An investigation into the effects of label noise on Dynamic Selection algorithms
