A Boosting-Based Prototype Weighting and Selection Scheme