A Simple Multi-Class Boosting Framework with Theoretical Guarantees and Empirical Proficiency

There is a need for simple yet accurate white-box learning systems that train quickly and with little data. To this end, we showcase REBEL, a multi-class boosting method, and present a novel family of weak learners called localized similarities. Our framework provably minimizes the training error of any dataset at an exponential rate. We carry out experiments on a variety of synthetic and real datasets, demonstrating a consistent tendency to avoid overfitting. We evaluate our method on MNIST and standard UCI datasets against other state-of-the-art methods, demonstrating its empirical proficiency.
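The exponential decay of training error claimed above is the hallmark guarantee of boosting: each round reweights the data toward examples the current ensemble misclassifies, and as long as every weak learner beats chance, the bound on training error shrinks by a multiplicative factor per round. The following is a minimal AdaBoost-style sketch of this mechanism using decision stumps; it is a generic illustration of the boosting principle, not REBEL itself, and the stump weak learner stands in for the paper's localized similarities.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    """A decision stump: predicts +1/-1 by thresholding one feature."""
    return sign * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def best_stump(X, y, w):
    """Exhaustive search for the stump with minimum weighted error."""
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, sign)
                err = float(np.sum(w[pred != y]))
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost(X, y, rounds=20):
    """Boosting loop: reweight examples so later stumps focus on mistakes."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, feat, thresh, sign = best_stump(X, y, w)
        if err >= 0.5:          # weak learner no better than chance: stop
            break
        err = max(err, 1e-12)   # avoid division by zero on perfect stumps
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, sign))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    agg = np.zeros(len(X))
    for alpha, feat, thresh, sign in ensemble:
        agg += alpha * stump_predict(X, feat, thresh, sign)
    return np.sign(agg)

# Toy 1-D dataset no single stump can fit, but the ensemble can.
X = np.arange(6, dtype=float).reshape(-1, 1)
y = np.array([1, 1, -1, -1, 1, 1], dtype=float)
ensemble = adaboost(X, y)
train_err = float(np.mean(predict(ensemble, X) != y))
```

The key quantitative fact this sketch exercises: if round $t$'s weak learner has weighted error $\epsilon_t < 1/2$, the training error of the ensemble is at most $\prod_t 2\sqrt{\epsilon_t(1-\epsilon_t)}$, which decays exponentially in the number of rounds.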
