A reliable ensemble based approach to semi-supervised learning
[1] Thomas G. Dietterich, et al. Improved Class Probability Estimates from Decision Tree Models, 2003.
[2] Alexander Zien, et al. Semi-Supervised Learning, 2006.
[3] Ludmila I. Kuncheva. Diversity in multiple classifier systems, 2005, Inf. Fusion.
[4] Ioannis E. Livieris. A New Ensemble Self-labeled Semi-supervised Algorithm, 2019, Informatica.
[5] Nizar Grira, et al. Unsupervised and Semi-supervised Clustering: A Brief Survey, 2004.
[6] Janez Demsar, et al. Statistical Comparisons of Classifiers over Multiple Data Sets, 2006, J. Mach. Learn. Res.
[7] David H. Wolpert, et al. An Efficient Method To Estimate Bagging's Generalization Error, 1999, Machine Learning.
[8] Francisco Herrera, et al. Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power, 2010, Inf. Sci.
[9] Michelangelo Ceci, et al. Semi-supervised classification trees, 2017, Journal of Intelligent Information Systems.
[10] Harry Zhang, et al. An Extensive Empirical Study on Semi-supervised Learning, 2010, 2010 IEEE International Conference on Data Mining.
[11] Bin Wang, et al. Semi-supervised Self-training for Sentence Subjectivity Classification, 2008, Canadian Conference on AI.
[12] Nong Sang, et al. Using clustering analysis to improve semi-supervised classification, 2013, Neurocomputing.
[13] Núria Macià, et al. Towards UCI+: A mindful repository design, 2014, Inf. Sci.
[14] Xiaojin Zhu, et al. Introduction to Semi-Supervised Learning, 2009, Synthesis Lectures on Artificial Intelligence and Machine Learning.
[15] David Yarowsky. Unsupervised Word Sense Disambiguation Rivaling Supervised Methods, 1995, ACL.
[16] Robert D. Nowak, et al. Unlabeled data: Now it helps, now it doesn't, 2008, NIPS.
[17] Jesús Alcalá-Fdez, et al. KEEL Data-Mining Software Tool: Data Set Repository, Integration of Algorithms and Experimental Analysis Framework, 2011, J. Multiple Valued Log. Soft Comput.
[18] Yide Wang, et al. Progressive Semisupervised Learning of Multiple Classifiers, 2018, IEEE Transactions on Cybernetics.
[19] Paulo Cortez, et al. Modeling wine preferences by data mining from physicochemical properties, 2009, Decis. Support Syst.
[20] Georgios Kostopoulos, et al. Semi-supervised regression: A recent review, 2018, J. Intell. Fuzzy Syst.
[21] Mikhail Belkin, et al. Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples, 2006, J. Mach. Learn. Res.
[22] J. Friedman. Greedy function approximation: A gradient boosting machine, 2001.
[23] Zhi-Hua Zhou, et al. Tri-training: exploiting unlabeled data using three classifiers, 2005, IEEE Transactions on Knowledge and Data Engineering.
[24] J. Friedman. Stochastic gradient boosting, 2002.
[25] Zhi-Hua Zhou. When semi-supervised learning meets ensemble learning, 2011.
[26] Horst Bischof, et al. Semi-Supervised Random Forests, 2009, 2009 IEEE 12th International Conference on Computer Vision.
[27] Leo Breiman. Random Forests, 2001, Machine Learning.
[28] Ayhan Demiriz, et al. Semi-Supervised Support Vector Machines, 1998, NIPS.
[29] J. Hanley, et al. The meaning and use of the area under a receiver operating characteristic (ROC) curve, 1982, Radiology.
[30] Zhiwen Yu, et al. A survey on ensemble learning, 2019, Frontiers of Computer Science.
[31] Junnan Li, et al. An effective framework based on local cores for self-labeled semi-supervised classification, 2020, Knowl. Based Syst.
[32] Robert E. Schapire. The strength of weak learnability, 1990, Mach. Learn.
[33] Tin Kam Ho. The Random Subspace Method for Constructing Decision Forests, 1998, IEEE Trans. Pattern Anal. Mach. Intell.
[34] Ye Zhang, et al. Hyperspectral Image Classification Based on Semi-Supervised Rotation Forest, 2017, Remote. Sens.
[35] Lars Kai Hansen, et al. Neural Network Ensembles, 1990, IEEE Trans. Pattern Anal. Mach. Intell.
[36] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1997, EuroCOLT.
[37] Leo Breiman. Bagging Predictors, 1996, Machine Learning.
[38] Zhi-Hua Zhou, et al. Exploiting unlabeled data to enhance ensemble diversity, 2010, 2010 IEEE International Conference on Data Mining.
[39] Alessandro Laio, et al. Clustering by fast search and find of density peaks, 2014, Science.
[40] Hamideh Afsarmanesh, et al. Semi-supervised self-training for decision tree classifiers, 2017, Int. J. Mach. Learn. Cybern.
[41] Pedro M. Domingos, et al. Tree Induction for Probability-Based Ranking, 2003, Machine Learning.
[42] Guoyin Wang, et al. Self-training semi-supervised classification based on density peaks of data, 2018, Neurocomputing.
[43] Juan José Rodríguez, et al. Rotation Forest: A New Classifier Ensemble Method, 2006, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[44] Sebastian Thrun, et al. Text Classification from Labeled and Unlabeled Documents using EM, 2000, Machine Learning.
[45] Zhongsheng Hua, et al. Semi-supervised learning based on nearest neighbor rule and cut edges, 2010, Knowl. Based Syst.
[46] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[47] David Mease, et al. Boosted Classification Trees and Class Probability/Quantile Estimation, 2007, J. Mach. Learn. Res.
[48] Francisco Herrera, et al. Self-labeled techniques for semi-supervised learning: taxonomy, software and empirical study, 2015, Knowledge and Information Systems.