Class Imbalance Learning
[1] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[2] Naonori Ueda,et al. Generalization error of ensemble estimators , 1996, Proceedings of International Conference on Neural Networks (ICNN'96).
[3] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[4] R. Polikar,et al. Ensemble based systems in decision making , 2006, IEEE Circuits and Systems Magazine.
[5] Kai Ming Ting,et al. A Comparative Study of Cost-Sensitive Boosting Algorithms , 2000, ICML.
[6] Nathalie Japkowicz,et al. A Novelty Detection Approach to Classification , 1995, IJCAI.
[7] Ludmila I. Kuncheva,et al. Measures of Diversity in Classifier Ensembles and Their Relationship with the Ensemble Accuracy , 2003, Machine Learning.
[8] Rosa Maria Valdovinos,et al. Class-dependant resampling for medical applications , 2005, Fourth International Conference on Machine Learning and Applications (ICMLA'05).
[9] Xingquan Zhu,et al. Lazy Bagging for Classifying Imbalanced Data , 2007, Seventh IEEE International Conference on Data Mining (ICDM 2007).
[10] Thomas G. Dietterich. Ensemble Methods in Machine Learning , 2000, Multiple Classifier Systems.
[11] Zhi-Hua Zhou,et al. Training Cost-Sensitive Neural Networks with Methods Addressing the Class Imbalance Problem , 2006, IEEE Trans. Knowl. Data Eng.
[12] Robert C. Holte,et al. C4.5, Class Imbalance, and Cost Sensitivity: Why Under-Sampling beats Over-Sampling , 2003 .
[13] Vipin Kumar,et al. Evaluating boosting algorithms to classify rare classes: comparison and improvements , 2001, Proceedings 2001 IEEE International Conference on Data Mining.
[14] Robert P. W. Duin,et al. Limits on the majority vote accuracy in classifier fusion , 2003, Pattern Analysis & Applications.
[15] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization , 2000, Machine Learning.
[16] Cen Li,et al. Classifying imbalanced data using a bagging ensemble variation (BEV) , 2007, ACM-SE 45.
[17] Kagan Tumer,et al. Linear and Order Statistics Combiners for Pattern Classification , 1999, ArXiv.
[18] Francisco Herrera,et al. A Proposal of Evolutionary Prototype Selection for Class Imbalance Problems , 2006, IDEAL.
[19] Yoav Freund,et al. Boosting the margin: A new explanation for the effectiveness of voting methods , 1997, ICML.
[20] José Salvador Sánchez,et al. Strategies for learning in class imbalance problems , 2003, Pattern Recognit..
[21] Xin Yao,et al. Ensemble learning via negative correlation , 1999, Neural Networks.
[22] Gustavo E. A. P. A. Batista,et al. Class Imbalances versus Class Overlapping: An Analysis of a Learning System Behavior , 2004, MICAI.
[23] Xin Yao,et al. Diversity creation methods: a survey and categorisation , 2004, Inf. Fusion.
[24] Yoav Freund,et al. Experiments with a New Boosting Algorithm , 1996, ICML.
[25] Nitesh V. Chawla,et al. C4.5 and Imbalanced Data Sets: Investigating the effect of sampling method, probabilistic estimate, and decision tree structure , 2003 .
[26] Gustavo E. A. P. A. Batista,et al. A study of the behavior of several methods for balancing machine learning training data , 2004, SIGKDD Explor.
[27] Xin Yao,et al. Simultaneous training of negatively correlated neural networks in an ensemble , 1999, IEEE Trans. Syst. Man Cybern. Part B.
[28] Giorgio Valentini,et al. Ensembles of Learning Machines , 2002, WIRN.
[29] Xin Yao,et al. Evolutionary ensembles with negative correlation learning , 2000, IEEE Trans. Evol. Comput..
[30] Nikunj C. Oza,et al. Online Ensemble Learning , 2000, AAAI/IAAI.
[31] Nitesh V. Chawla,et al. Editorial: special issue on learning from imbalanced data sets , 2004, SIGKDD Explor.
[32] Sungzoon Cho,et al. Observational Learning Algorithm for an Ensemble of Neural Networks , 2002, Pattern Analysis & Applications.
[33] Francisco Herrera,et al. Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study , 2003, IEEE Trans. Evol. Comput..
[34] Rosa Maria Valdovinos,et al. The Imbalanced Training Sample Problem: Under or over Sampling? , 2004, SSPR/SPR.
[35] Charles Elkan,et al. The Foundations of Cost-Sensitive Learning , 2001, IJCAI.
[36] Zhi-Hua Zhou,et al. Exploratory Under-Sampling for Class-Imbalance Learning , 2006, Sixth International Conference on Data Mining (ICDM'06).
[37] Xin Yao,et al. An analysis of diversity measures , 2006, Machine Learning.
[38] Nitesh V. Chawla,et al. SMOTE: Synthetic Minority Over-sampling Technique , 2002, J. Artif. Intell. Res..
[39] Anders Krogh,et al. Neural Network Ensembles, Cross Validation, and Active Learning , 1994, NIPS.
[40] Taghi M. Khoshgoftaar,et al. Using evolutionary sampling to mine imbalanced data , 2007, Sixth International Conference on Machine Learning and Applications (ICMLA 2007).
[41] Dimitris Kanellopoulos,et al. Handling imbalanced datasets: A review , 2006 .
[42] Peter Tiño,et al. Managing Diversity in Regression Ensembles , 2005, J. Mach. Learn. Res..
[43] Victor S. Sheng,et al. Cost-Sensitive Learning and the Class Imbalance Problem , 2008 .
[44] Nathalie Japkowicz,et al. The class imbalance problem: A systematic study , 2002, Intell. Data Anal..
[45] Nitesh V. Chawla,et al. SMOTEBoost: Improving Prediction of the Minority Class in Boosting , 2003, PKDD.
[46] Foster J. Provost,et al. Learning When Training Data are Costly: The Effect of Class Distribution on Tree Induction , 2003, J. Artif. Intell. Res..
[47] Herna L. Viktor,et al. Learning from imbalanced data sets with boosting and data generation: the DataBoost-IM approach , 2004, SIGKDD Explor.
[48] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[49] Hui Han,et al. Borderline-SMOTE: A New Over-Sampling Method in Imbalanced Data Sets Learning , 2005, ICIC.
[50] Nitesh V. Chawla,et al. Exploiting Diversity in Ensembles: Improving the Performance on Unbalanced Datasets , 2007, MCS.
[51] Xin Yao,et al. Diversity analysis on imbalanced data sets by using ensemble models , 2009, 2009 IEEE Symposium on Computational Intelligence and Data Mining.
[52] Yang Wang,et al. Boosting for Learning Multiple Classes with Imbalanced Class Distribution , 2006, Sixth International Conference on Data Mining (ICDM'06).
[53] Xin Yao,et al. Evolving a cooperative population of neural networks by minimizing mutual information , 2001, Proceedings of the 2001 Congress on Evolutionary Computation (IEEE Cat. No.01TH8546).
[54] Xin Yao,et al. Ensemble Learning Using Multi-Objective Evolutionary Algorithms , 2006, J. Math. Model. Algorithms.
[55] Xin Yao,et al. Diversity exploration and negative correlation learning on imbalanced data sets , 2009, 2009 International Joint Conference on Neural Networks.
[56] Stan Matwin,et al. Addressing the Curse of Imbalanced Training Sets: One-Sided Selection , 1997, ICML.
[57] Gary M. Weiss. Mining with rarity: a unifying framework , 2004, SIGKDD Explor.
[58] Zhi-Hua Zhou,et al. On Multi-Class Cost-Sensitive Learning , 2006, Comput. Intell.
[59] Taghi M. Khoshgoftaar,et al. Experimental perspectives on learning from imbalanced data , 2007, ICML '07.
[60] Anca Ralescu,et al. Issues in Mining Imbalanced Data Sets - A Review Paper , 2005 .