Semi-Supervised Additive Logistic Regression: A Gradient Descent Solution

This paper describes a semi-supervised regularized method for additive logistic regression. A graph regularization term on the combined function is added to the original cost functional used in AdaBoost; this term constrains the learned function to be smooth on a graph over the labeled and unlabeled data. The gradient descent solution is then derived, with the advantage that the regularization parameter can be selected adaptively. Finally, the step size of each iteration is computed by Newton-Raphson iteration. Experiments on benchmark data sets show that the algorithm gives better results than existing methods.
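To make the construction concrete, one plausible form of the regularized cost functional is sketched below. It assumes the exponential loss of AdaBoost, labels y_i \in \{-1,+1\}, l labeled and u unlabeled examples, edge weights W_{ij} on the data graph, and a regularization parameter \gamma; these symbols are illustrative choices, not fixed by the abstract itself:

J(F) = \sum_{i=1}^{l} \exp\bigl(-y_i F(x_i)\bigr) + \gamma \sum_{i,j=1}^{l+u} W_{ij} \bigl(F(x_i) - F(x_j)\bigr)^2

Writing the penalty as \gamma F^{\top} L F, where L is the graph Laplacian, recovers the smoothness constraint on the graph. Under the same assumptions, each boosting round would fit a base learner f_m to the negative functional gradient of J, and the step size \alpha would be obtained by Newton-Raphson iteration on \phi(\alpha) = J(F + \alpha f_m):

\alpha \leftarrow \alpha - \frac{\phi'(\alpha)}{\phi''(\alpha)}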
