Is regularization unnecessary for boosting?
[1] J. Friedman. Stochastic gradient boosting, 2002.
[2] Wenxin Jiang. On weak base hypotheses and their implications for boosting regression and classification, 2002.
[3] L. Breiman. Using adaptive bagging to debias regressions, 1999.
[4] Robert A. Jacobs, et al. Hierarchical Mixtures of Experts and the EM Algorithm, 1993, Neural Computation.
[5] J. Friedman. Greedy function approximation: A gradient boosting machine, 2001.
[6] Geoffrey E. Hinton, et al. Adaptive Mixtures of Local Experts, 1991, Neural Computation.
[7] Yuhong Yang, et al. Minimax Nonparametric Classification, Part I: Rates of Convergence, 1998.
[8] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1995, EuroCOLT.
[9] Thomas Richardson, et al. Boosting methodology for regression problems, 1999, AISTATS.
[10] Wenxin Jiang. Does Boosting Overfit: Views From an Exact Solution, 2000.
[11] László Györfi, et al. A Probabilistic Theory of Pattern Recognition, 1996, Stochastic Modelling and Applied Probability.
[12] J. Friedman. Additive logistic regression: A statistical view of boosting (Special Invited Paper), 2000.
[13] Dale Schuurmans, et al. Boosting in the Limit: Maximizing the Margin of Learned Ensembles, 1998, AAAI/IAAI.
[14] D. L. Donoho, et al. Ideal spatial adaptation via wavelet shrinkage, 1994.
[15] Leo Breiman, et al. Bagging Predictors, 1996, Machine Learning.
[16] Nouna Kettaneh, et al. Statistical Modeling by Wavelets, 1999, Technometrics.
[17] Yoav Freund, et al. Boosting the margin: A new explanation for the effectiveness of voting methods, 1997, ICML.
[18] Robert E. Schapire, et al. Theoretical Views of Boosting, 1999, EuroCOLT.