Joint leaf-refinement and ensemble pruning through L1 regularization
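The page carries no abstract, but the title points to a method that jointly refines leaf predictions and prunes ensemble members through an L1 penalty on per-tree weights (cf. refs [27] Proximal Algorithms and [57] the Lasso). As a rough, hypothetical illustration of the pruning side only, the sketch below learns weights for the trees of a scikit-learn random forest by proximal gradient descent (soft-thresholding) and drops trees whose weight reaches zero. The data set, the squared-error objective, and the values of lam, lr, and n_steps are assumptions for illustration, not the paper's setup, and the joint leaf-refinement step is omitted for brevity.

```python
# Hypothetical sketch (not the paper's reference implementation): prune a trained
# random forest by learning per-tree weights under an L1 penalty with proximal
# gradient descent (soft-thresholding), then keep only trees with non-zero weight.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=64, random_state=0).fit(X_tr, y_tr)

# Per-tree class-probability predictions on the training set: shape (M, N, C).
P = np.stack([t.predict_proba(X_tr) for t in forest.estimators_])
M, N, C = P.shape
Y = np.eye(C)[y_tr]                  # one-hot targets, shape (N, C)

w = np.full(M, 1.0 / M)              # tree weights, initialised uniformly
lam, lr, n_steps = 5e-2, 0.1, 500    # L1 strength, step size, iterations (assumed)

for _ in range(n_steps):
    F = np.tensordot(w, P, axes=1)                            # ensemble output, shape (N, C)
    G = np.tensordot(P, F - Y, axes=([1, 2], [0, 1])) / N     # grad of 0.5*MSE w.r.t. w
    w = w - lr * G                                            # gradient step
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)    # proximal step (soft-threshold)

kept = np.flatnonzero(w)
print(f"kept {kept.size}/{M} trees")

# Predict with the pruned, re-weighted ensemble.
P_te = np.stack([forest.estimators_[i].predict_proba(X_te) for i in kept])
accuracy = (np.tensordot(w[kept], P_te, axes=1).argmax(axis=1) == y_te).mean()
print("test accuracy:", accuracy)
```

In this toy setup, trees whose weights survive the soft-thresholding form the pruned, re-weighted ensemble; a larger lam prunes more aggressively at the cost of accuracy.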
[1] Katharina Morik,et al. There is no Double-Descent in Random Forests , 2021, ArXiv.
[2] Mohsen Shahhosseini,et al. Improved Weighted Random Forest for Classification Problems , 2020, ArXiv.
[3] Paweł Zyblewski,et al. Novel clustering-based pruning algorithms , 2020, Pattern Analysis and Applications.
[4] Sarangapani Jagannathan,et al. A comprehensive survey on model compression and acceleration , 2020, Artificial Intelligence Review.
[5] Natalia Gimelshein,et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library , 2019, NeurIPS.
[6] Jorge Cabral,et al. Machine Learning in Resource-Scarce Embedded Systems, FPGAs, and End-Devices: A Survey , 2019, Electronics.
[7] Pawel Zyblewski,et al. Clustering-Based Ensemble Pruning and Multistage Organization Using Diversity , 2019, HAIS.
[8] Hieu Pham,et al. Optimizing Ensemble Weights and Hyperparameters of Machine Learning Models for Regression Problems , 2019, Machine Learning with Applications.
[9] C. Faloutsos,et al. Ensemble Methods , 2019, Machine Learning with Spark™ and Python®.
[10] Raffaele Perego,et al. X-CLEaVER: Learning Ranking Ensembles by Growing and Pruning Trees , 2018 .
[11] Katharina Morik,et al. Realization of Random Forest for Real-Time Evaluation through Tree Framing , 2018, 2018 IEEE International Conference on Data Mining (ICDM).
[12] Fabrizio Silvestri,et al. X-CLEaVER , 2018, ACM Trans. Intell. Syst. Technol..
[13] Mojtaba Masoudinejad,et al. Machine Learning Based Indoor Localisation Using Environmental Data in PhyNetLab Warehouse , 2018 .
[14] Mingliang Xu,et al. Margin & diversity based ordering ensemble pruning , 2018, Neurocomputing.
[15] Saurabh Goyal,et al. Resource-efficient Machine Learning in 2 KB RAM for the Internet of Things , 2017, ICML.
[16] Bin Fu,et al. Generalized Ambiguity Decompositions for Classification with Applications in Active Learning and Unsupervised Ensemble Pruning , 2017, AAAI.
[17] Hanan Samet,et al. Pruning Filters for Efficient ConvNets , 2016, ICLR.
[18] Thiago J. M. Moura,et al. Combining diversity measures for ensemble pruning , 2016, Pattern Recognit. Lett..
[19] Micha Elsner,et al. Feature Selection , 2014, Computer Vision, A Reference Guide.
[20] Erwan Scornet,et al. A random forest guided tour , 2015, TEST.
[21] Feiping Nie,et al. Robust Dictionary Learning with Capped l1-Norm , 2015, IJCAI.
[22] Jian Sun,et al. Global refinement of random forest , 2015, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[23] Jimmy Ba,et al. Adam: A Method for Stochastic Optimization , 2014, ICLR.
[24] Mehryar Mohri,et al. Deep Boosting , 2014, ICML.
[25] Vikas Sindhwani,et al. Near-separable Non-negative Matrix Factorization with ℓ1 and Bregman Loss Functions , 2013, SDM.
[26] Sebastian Nowozin,et al. Decision Jungles: Compact and Rich Models for Classification , 2013, NIPS.
[27] Stephen P. Boyd,et al. Proximal Algorithms , 2013, Found. Trends Optim..
[28] N. D. Freitas,et al. Narrowing the Gap: Random Forests In Theory and In Practice , 2014, ICML.
[29] Gilles Louppe,et al. Ensembles on Random Patches , 2012, ECML/PKDD.
[30] Yang Yu,et al. Diversity Regularized Ensemble Pruning , 2012, ECML/PKDD.
[31] Gaël Varoquaux,et al. Scikit-learn: Machine Learning in Python , 2011, J. Mach. Learn. Res..
[32] Xindong Wu,et al. Ensemble pruning via individual contribution ordering , 2010, KDD.
[33] Gérard Biau,et al. Analysis of a Random Forests Model , 2010, J. Mach. Learn. Res..
[34] Giorgio Valentini,et al. Applications of Supervised and Unsupervised Ensemble Methods , 2009, Applications of Supervised and Unsupervised Ensemble Methods.
[35] Daniel Hernández-Lobato,et al. An Analysis of Ensemble Pruning Techniques Based on Ordered Aggregation , 2009, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[36] J. Demšar. Statistical Comparisons of Classifiers over Multiple Data Sets , 2006, J. Mach. Learn. Res..
[37] William Nick Street,et al. Ensemble Pruning Via Semi-definite Programming , 2006, J. Mach. Learn. Res..
[38] Gonzalo Martínez-Muñoz,et al. Pruning in ordered bagging ensembles , 2006, ICML.
[39] Pierre Geurts,et al. Extremely randomized trees , 2006, Machine Learning.
[40] Peter Tiño,et al. Managing Diversity in Regression Ensembles , 2005, J. Mach. Learn. Res..
[41] V. Koltchinskii,et al. Empirical margin distributions and bounding the generalization error of combined classifiers , 2002, math/0405343.
[42] L. Breiman. Random Forests , 2001, Encyclopedia of Machine Learning and Data Mining.
[43] Zoran Obradovic,et al. Effective pruning of neural network classifier ensembles , 2001, IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222).
[44] Fabio Roli,et al. Design of effective multiple classifier systems by clustering of classifiers , 2000, Proceedings 15th International Conference on Pattern Recognition. ICPR-2000.
[45] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[46] Thomas G. Dietterich,et al. Pruning Adaptive Boosting , 1997, ICML.
[47] Mohammad Shoyaib,et al. Introducing Confidence as a Weight in Random Forest , 2019, 2019 International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST).
[48] Katharina Morik,et al. Decision Tree and Random Forest Implementations for Fast Filtering of Sensor Data , 2018, IEEE Transactions on Circuits and Systems I: Regular Papers.
[49] Paolo Missier,et al. Data Integration in the Life Sciences , 2018, Lecture Notes in Computer Science.
[50] Y. Freund,et al. Boosting , 2012 .
[51] Grigorios Tsoumakas,et al. An Ensemble Pruning Primer , 2009, Applications of Supervised and Unsupervised Ensemble Methods.
[52] Claudio Conversano,et al. Decision Tree Induction , 2009, Encyclopedia of Data Warehousing and Mining.
[53] Alberto Suárez,et al. Aggregation Ordering in Bagging , 2004 .
[54] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[55] Azriel Rosenfeld,et al. Machine Learning and Data Mining in Pattern Recognition , 2000, Lecture Notes in Computer Science.
[56] L. Breiman. SOME INFINITY THEORY FOR PREDICTOR ENSEMBLES , 2000 .
[57] R. Tibshirani. Regression Shrinkage and Selection via the Lasso , 1996 .
[58] Dirk Van,et al. Ensemble Methods: Foundations and Algorithms , 2012 .