Boosting is a versatile machine learning technique with applications in image processing, computer vision, data mining, and many other areas. It is based on the premise that the classification performance of a set of weak learners can be boosted by a weighted combination of them. A number of boosting methods have been proposed in the literature, such as AdaBoost, LPBoost, SoftBoost, and their variants. However, the learning update strategies used in these methods often lead to overfitting and instability in classification accuracy. Improved boosting methods that incorporate regularization can overcome these difficulties. In this paper, we propose a Riemannian-distance-regularized LPBoost, dubbed RBoost. RBoost uses the closed-form Riemannian distance between two square-root densities (representing the distribution over the training data and the classification error, respectively) to regularize the error distribution in an iterative update formula. Because this distance is available in closed form, RBoost incurs much lower computational cost than other regularized boosting algorithms. We present experimental results comparing our algorithm to the recently published methods LPBoost and CAVIAR on a variety of datasets, including the publicly available OASIS database, an in-house epilepsy database, and the well-known UCI repository. The results show that RBoost outperforms the competing methods in both accuracy and efficiency.
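To make the regularizer concrete: for discrete distributions, the Riemannian (geodesic) distance between two square-root densities on the unit Hilbert sphere has the well-known closed form d(p, q) = arccos(Σᵢ √(pᵢ qᵢ)), the arc cosine of the Bhattacharyya coefficient. The sketch below computes this distance and shows how it might enter a regularized update objective. Note that the function names, the penalty weight `lam`, and the `regularized_objective` composition are illustrative assumptions for exposition, not the paper's actual RBoost formulation.

```python
import numpy as np

def riemannian_distance(p, q):
    """Geodesic distance between two discrete probability distributions,
    via their square-root representations on the unit sphere:
    d(p, q) = arccos( sum_i sqrt(p_i * q_i) ).
    """
    bc = np.sum(np.sqrt(p) * np.sqrt(q))  # Bhattacharyya coefficient, in [0, 1]
    return np.arccos(np.clip(bc, -1.0, 1.0))  # clip guards against float round-off

def regularized_objective(d_new, d_old, margin_term, lam):
    """Hypothetical regularized update criterion: the usual LPBoost-style
    margin term plus a penalty (weight `lam`) on how far the new weight
    distribution strays, in geodesic distance, from the previous one."""
    return margin_term + lam * riemannian_distance(d_new, d_old)

# Identical distributions are at distance 0; distributions with disjoint
# support sit at the maximal geodesic distance pi/2 on the sphere.
d_same = riemannian_distance(np.array([0.5, 0.5]), np.array([0.5, 0.5]))
d_far = riemannian_distance(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

Because the distance is a single arccos of an inner product, each regularized update costs only O(n) in the number of training samples, which is the source of the efficiency claim relative to regularizers requiring iterative optimization.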
[1] Gunnar Rätsch, et al., "Boosting Algorithms for Maximizing the Soft Margin," NIPS, 2007.
[2] Y. Freund, et al., "Discussion of the Paper 'Additive Logistic Regression: A Statistical View of Boosting'," 2000.
[3] Guido Gerig, et al., "Unbiased diffeomorphic atlas construction for computational anatomy," NeuroImage, 2004.
[4] S. V. N. Vishwanathan, et al., "Entropy Regularized LPBoost," ALT, 2008.
[5] Yoram Singer, et al., "Improved Boosting Algorithms Using Confidence-rated Predictions," COLT '98, 1998.
[6] John G. Csernansky, et al., "Open Access Series of Imaging Studies (OASIS): Cross-sectional MRI Data in Young, Middle Aged, Nondemented, and Demented Older Adults," Journal of Cognitive Neuroscience, 2007.
[7] Ayhan Demiriz, et al., "Linear Programming Boosting via Column Generation," Machine Learning, 2002.
[8] Ting Chen, et al., "CAVIAR: Classification via Aggregated Regression and Its Application in Classifying OASIS Brain Database," IEEE International Symposium on Biomedical Imaging (ISBI), 2010.
[9] Stephen P. Boyd, et al., "Convex Optimization," Cambridge University Press, 2004.
[10] Anand Rangarajan, et al., "Kernel Fisher discriminant for shape-based classification in epilepsy," Medical Image Analysis, 2007.