Learning from evolving data streams through ensembles of random patches
Heitor Murilo Gomes | Jesse Read | Albert Bifet | Robert J. Durrant