The Most Robust Loss Function for Boosting
[1] Takafumi Kanamori, et al. Information Geometry of U-Boost and Bregman Divergence, 2004, Neural Computation.
[2] Osamu Watanabe, et al. MadaBoost: A Modification of AdaBoost, 2000, COLT.
[3] Gunnar Rätsch, et al. Soft Margins for AdaBoost, 2001, Machine Learning.
[4] Y. Freund, et al. Discussion of the Paper "Additive Logistic Regression: A Statistical View of Boosting", 2000.
[5] Vladimir N. Vapnik, et al. The Nature of Statistical Learning Theory, 2000, Statistics for Engineering and Information Science.
[6] John Law, et al. Robust Statistics: The Approach Based on Influence Functions, 1986.
[7] Peter L. Bartlett, et al. Boosting Algorithms as Gradient Descent, 1999, NIPS.
[8] John D. Lafferty, et al. Boosting and Maximum Likelihood for Exponential Models, 2001, NIPS.
[9] Yoav Freund, et al. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting, 1995, EuroCOLT.
[10] J. Copas. Binary Regression Models for Contaminated Data, 1988.
[11] Shinto Eguchi, et al. Robustifying AdaBoost by Adding the Naive Error Rate, 2004, Neural Computation.