Jensen-Shannon boosting learning for object recognition

In this paper, we propose a novel learning method, called Jensen-Shannon Boosting (JSBoost), and demonstrate its application to object recognition. JSBoost incorporates Jensen-Shannon (JS) divergence [Y. Rubner et al. (2001)] into AdaBoost learning. JS divergence is advantageous in that it provides a more appropriate measure of dissimilarity between two classes and is numerically more stable than other measures such as Kullback-Leibler (KL) divergence (see [Y. Rubner et al. (2001)]). The best features are learned iteratively by maximizing the projected JS divergence, and the best weak classifiers are derived from them. The weak classifiers are then combined into a strong classifier by minimizing the recognition error. JSBoost learning is demonstrated on face recognition using a local binary pattern (LBP) [M. Pietikainen et al. (2004)] based representation: JSBoost selects the best LBP features from thousands of candidate features and constructs a strong classifier from the selected features. Experiments show that JSBoost produces better face recognition results than other AdaBoost variants such as RealBoost [R.E. Schapire et al. (1998)], GentleBoost [J. Friedman et al. (2000)], and KL-Boost [C. Liu et al. (2003)].
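The selection step described above (choosing, at each boosting round, the feature whose projected class-conditional distributions are maximally separated under JS divergence) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the helper names `js_divergence` and `select_best_feature`, the histogram binning with shared bin edges, and the handling of boosting weights are assumptions made for clarity.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions.

    Unlike KL divergence, JS is symmetric and bounded (by ln 2 when
    using the natural log), and the mixture m keeps every log argument
    strictly positive, which is the numerical-stability advantage the
    abstract refers to.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)
    kl_pm = np.sum(p * np.log(p / m))  # KL(p || m)
    kl_qm = np.sum(q * np.log(q / m))  # KL(q || m)
    return 0.5 * kl_pm + 0.5 * kl_qm

def select_best_feature(pos, neg, w_pos, w_neg, n_bins=32):
    """Pick the feature column whose weighted projected histograms of the
    two classes have maximal JS divergence.

    pos, neg: (n_samples, n_features) arrays of candidate feature
    responses for the two classes (e.g. LBP-based features).
    w_pos, w_neg: current boosting weights of the samples.
    """
    best_j, best_div = -1, -np.inf
    lo = min(pos.min(), neg.min())
    hi = max(pos.max(), neg.max())
    edges = np.linspace(lo, hi, n_bins + 1)  # shared bin edges (assumption)
    for j in range(pos.shape[1]):
        hp, _ = np.histogram(pos[:, j], bins=edges, weights=w_pos)
        hq, _ = np.histogram(neg[:, j], bins=edges, weights=w_neg)
        d = js_divergence(hp, hq)
        if d > best_div:
            best_j, best_div = j, d
    return best_j, best_div
```

In a full boosting round, the selected feature would then be turned into a weak classifier, for example via the log-likelihood ratio of the two weighted histograms, and the sample weights updated before the next selection.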

[1] Igor Vajda, et al. Note on discrimination information and variation (Corresp.), 1970, IEEE Trans. Inf. Theory.

[2] Demetrios Kazakos, et al. A Decision Theory Approach to the Approximation of Discrete Probability Densities, 1980, IEEE Trans. Pattern Anal. Mach. Intell.

[3] Jianhua Lin, et al. Divergence measures based on the Shannon entropy, 1991, IEEE Trans. Inf. Theory.

[4] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1995, EuroCOLT.

[5] Alex Pentland, et al. A Bayesian similarity measure for direct image matching, 1996, Proceedings of the 13th International Conference on Pattern Recognition.

[6] Matti Pietikäinen, et al. A comparative study of texture measures with classification based on featured distributions, 1996, Pattern Recognit.

[7] Yoram Singer, et al. Improved Boosting Algorithms Using Confidence-rated Predictions, 1998, COLT '98.

[8] Joachim M. Buhmann, et al. Empirical evaluation of dissimilarity measures for color and texture, 1999, Proceedings of the Seventh IEEE International Conference on Computer Vision.

[9] Y. Freund, et al. Discussion of the paper "Additive Logistic Regression: A Statistical View of Boosting", 2000, Ann. Statist.

[10] Joachim M. Buhmann, et al. Empirical Evaluation of Dissimilarity Measures for Color and Texture, 2001, Comput. Vis. Image Underst.

[11] Matti Pietikäinen, et al. Multiresolution Gray-Scale and Rotation Invariant Texture Classification with Local Binary Patterns, 2002, IEEE Trans. Pattern Anal. Mach. Intell.

[12] Harry Shum, et al. Kullback-Leibler boosting, 2003, Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR).

[13] Jean-Philippe Thiran, et al. The BANCA Database and Evaluation Protocol, 2003, AVBPA.

[14] Matti Pietikäinen, et al. Face Recognition with Local Binary Patterns, 2004, ECCV.

[15] LinLin Shen, et al. Face authentication test on the BANCA database, 2004, Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004).

[16] B. K. Julsing, et al. Face Recognition with Local Binary Patterns, 2012.