The deterministic subspace method for constructing classifier ensembles
Michał Koziarski | Bartosz Krawczyk | Michał Woźniak