3 IMBALANCED DATASETS: FROM SAMPLING TO CLASSIFIERS
[1] Yoav Freund,et al. Experiments with a New Boosting Algorithm , 1996, ICML.
[2] Taeho Jo,et al. Class imbalances versus small disjuncts , 2004, SIGKDD Explor..
[3] David A. Cieslak,et al. Automatically countering imbalance and its empirical relationship to cost , 2008, Data Mining and Knowledge Discovery.
[4] James P. Egan,et al. Signal detection theory and ROC analysis , 1975 .
[5] Gustavo E. A. P. A. Batista,et al. A study of the behavior of several methods for balancing machine learning training data , 2004, SIGKDD Explor..
[6] Alberto Maria Segre,et al. Programs for Machine Learning , 1994 .
[7] Herna L. Viktor,et al. Learning from imbalanced data sets with boosting and data generation: the DataBoost-IM approach , 2004, SIGKDD Explor..
[8] Yunqian Ma,et al. Imbalanced Datasets: From Sampling to Classifiers , 2013 .
[9] Mark Goadrich,et al. The relationship between Precision-Recall and ROC curves , 2006, ICML.
[10] Pedro M. Domingos,et al. Tree Induction for Probability-Based Ranking , 2003, Machine Learning.
[11] Chumphol Bunkhumpornpat,et al. Safe-Level-SMOTE: Safe-Level-Synthetic Minority Over-Sampling TEchnique for Handling the Class Imbalanced Problem , 2009, PAKDD.
[12] Fredric C. Gey,et al. The Relationship between Recall and Precision , 1994, J. Am. Soc. Inf. Sci..
[13] Stan Matwin,et al. Addressing the Curse of Imbalanced Training Sets: One-Sided Selection , 1997, ICML.
[14] David A. Cieslak,et al. Learning Decision Trees for Unbalanced Data , 2008, ECML/PKDD.
[15] I. Tomek. An Experiment with the Edited Nearest-Neighbor Rule , 1976 .
[16] Eric Bauer,et al. An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants , 1999, Machine Learning.
[17] N. Japkowicz. Learning from Imbalanced Data Sets: A Comparison of Various Strategies * , 2000 .
[18] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[19] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[20] Hui Han,et al. Borderline-SMOTE: A New Over-Sampling Method in Imbalanced Data Sets Learning , 2005, ICIC.
[21] Zhi-Hua Zhou,et al. Exploratory Under-Sampling for Class-Imbalance Learning , 2006, ICDM.
[22] Chumphol Bunkhumpornpat,et al. DBSMOTE: Density-Based Synthetic Minority Over-sampling TEchnique , 2011, Applied Intelligence.
[23] Tom Fawcett,et al. Robust Classification for Imprecise Environments , 2000, Machine Learning.
[24] Nitesh V. Chawla,et al. SMOTEBoost: Improving Prediction of the Minority Class in Boosting , 2003, PKDD.
[25] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[26] J A Swets,et al. Measuring the accuracy of diagnostic systems. , 1988, Science.
[27] Oleksandr Makeyev,et al. Neural network with ensembles , 2010, The 2010 International Joint Conference on Neural Networks (IJCNN).
[28] Kai Ming Ting,et al. A Comparative Study of Cost-Sensitive Boosting Algorithms , 2000, ICML.
[29] Jorma Laurikkala,et al. Improving Identification of Difficult Small Classes by Balancing Class Distribution , 2001, AIME.
[30] Nitesh V. Chawla,et al. SMOTE: Synthetic Minority Over-sampling Technique , 2002, J. Artif. Intell. Res..
[31] Nitesh V. Chawla,et al. Classification and knowledge discovery in protein databases , 2004, J. Biomed. Informatics.
[32] David J. Hand,et al. A Simple Generalisation of the Area Under the ROC Curve for Multiple Class Classification Problems , 2001, Machine Learning.
[33] David A. Cieslak,et al. Hellinger distance decision trees are robust and skew-insensitive , 2011, Data Mining and Knowledge Discovery.
[34] Nils J. Nilsson,et al. A Formal Basis for the Heuristic Determination of Minimum Cost Paths , 1968, IEEE Trans. Syst. Sci. Cybern..
[35] Hisashi Kashima,et al. Roughly balanced bagging for imbalanced data , 2009, Stat. Anal. Data Min..
[36] Andrew P. Bradley,et al. The use of the area under the ROC curve in the evaluation of machine learning algorithms , 1997, Pattern Recognit..