Mean Error Rate Weighted Online Boosting Method
[1] João Gama, et al. On evaluating stream learning algorithms, 2012, Machine Learning.
[2] Alberto Cano, et al. Kappa Updated Ensemble for drifting data stream mining, 2019, Machine Learning.
[3] Cynthia Rudin, et al. Online coordinate boosting, 2009, IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops).
[4] Roberto Souto Maior de Barros, et al. A Boosting-like Online Learning Ensemble, 2016, International Joint Conference on Neural Networks (IJCNN).
[5] Janez Demsar, et al. Statistical Comparisons of Classifiers over Multiple Data Sets, 2006, J. Mach. Learn. Res.
[6] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1997, J. Comput. Syst. Sci.
[7] Leo Breiman, et al. Bagging Predictors, 1996, Machine Learning.
[8] Xin Yao, et al. Online Ensemble Learning of Data Streams with Gradually Evolved Classes, 2016, IEEE Transactions on Knowledge and Data Engineering.
[9] Paul H. J. Kelly, et al. Performance prediction of paging workloads using lightweight tracing, 2006, Future Gener. Comput. Syst.
[10] Der-Jiunn Deng, et al. Concept Drift Detection and Adaption in Big Imbalance Industrial IoT Data Using an Ensemble Learning Method of Offline Classifiers, 2019, IEEE Access.
[11] Ayhan Demiriz, et al. Linear Programming Boosting via Column Generation, 2002, Machine Learning.
[12] Geoff Holmes, et al. MOA: Massive Online Analysis, 2010, J. Mach. Learn. Res.
[13] Roberto Souto Maior de Barros, et al. Online AdaBoost-based methods for multiclass problems, 2019, Artificial Intelligence Review.
[14] Geoff Holmes, et al. New ensemble methods for evolving data streams, 2009, KDD.
[15] Manfred K. Warmuth, et al. The Weighted Majority Algorithm, 1994, Inf. Comput.
[16] Horst Bischof, et al. On robustness of on-line boosting - a competitive study, 2009, IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops).
[17] Marcus A. Maloof, et al. Dynamic Weighted Majority: An Ensemble Method for Drifting Concepts, 2007, J. Mach. Learn. Res.
[18] Shankar Vembu, et al. Chemical gas sensor drift compensation using classifier ensembles, 2012, Sensors and Actuators B: Chemical.
[19] Roberto Souto Maior de Barros, et al. An overview and comprehensive comparison of ensembles for concept drift, 2019, Inf. Fusion.
[20] Joanna Jedrzejowicz, et al. Gene Expression Programming Classifier with Concept Drift Detection Based on Fisher Exact Test, 2019, KES-IDT.
[21] Kian-Lee Tan, et al. Fast hierarchical clustering and its validation, 2003, Data Knowl. Eng.
[22] Yoram Singer, et al. Improved Boosting Algorithms Using Confidence-rated Predictions, 1998, COLT '98.
[23] Xin Yao, et al. DDD: A New Ensemble Approach for Dealing with Concept Drift, 2012, IEEE Transactions on Knowledge and Data Engineering.
[24] Leonardo Trujillo, et al. Random Tree Generator for an FPGA-based Genetic Programming System, 2016, GECCO.
[25] Vincenzo Loia, et al. Drift-Aware Methodology for Anomaly Detection in Smart Grid, 2019, IEEE Access.
[26] João Gama, et al. Learning with Drift Detection, 2004, SBIA.
[27] Wei-Yin Loh, et al. Classification and regression trees, 2011, WIREs Data Mining Knowl. Discov.
[28] Roberto Souto Maior de Barros, et al. Speeding Up Recovery from Concept Drifts, 2014, ECML/PKDD.
[29] Yoav Freund, et al. An Adaptive Version of the Boost by Majority Algorithm, 1999, COLT '99.
[30] Nathalie Japkowicz, et al. Adaptive learning on mobile network traffic data, 2018, Connect. Sci.
[31] Yoav Freund, et al. Boosting a weak learning algorithm by majority, 1995, Inf. Comput.
[32] João Gama, et al. A new dynamic modeling framework for credit risk assessment, 2016, Expert Syst. Appl.
[33] José del Campo-Ávila, et al. Online and Non-Parametric Drift Detection Methods Based on Hoeffding’s Bounds, 2015, IEEE Transactions on Knowledge and Data Engineering.