Boosted Bayesian Kernel Classifier Method for Face Detection

In this paper, we present a novel face detection approach based on an AdaBoosted relevance vector machine (RVM). The novelty lies in constructing an ensemble of kernel classifiers with different kernel parameters. We use Fisher's criterion to select a subset of Haar-like features. The proposed combination generalizes better than the support vector machine (SVM) on imbalanced classification problems, and combining a boosting algorithm with RVM classifiers yields an accurate, sparse model that performs well in real-time applications. The method is compared, in terms of classification accuracy, with commonly used alternatives such as the SVM and the RVM without boosting on the CBCL face database. Results indicate that the proposed method outperforms these previous approaches overall while retaining very good sparsity.
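
To make the feature-selection step concrete, the sketch below scores each Haar-like feature response with Fisher's criterion, (mu_pos - mu_neg)^2 / (var_pos + var_neg), and keeps the top-k features. This is a minimal illustration under the assumption of labels in {-1, +1}; the function names are hypothetical, as the paper publishes no code.

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher's criterion per feature: (mu+ - mu-)^2 / (var+ + var-).

    X: (n_samples, n_features) Haar-like feature responses.
    y: labels in {-1, +1}.
    """
    pos, neg = X[y == 1], X[y == -1]
    num = (pos.mean(axis=0) - neg.mean(axis=0)) ** 2
    den = pos.var(axis=0) + neg.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def select_features(X, y, k):
    """Indices of the k features with the highest Fisher score."""
    return np.argsort(fisher_scores(X, y))[::-1][:k]
```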
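
The boosted kernel classifier itself can be sketched as a standard discrete AdaBoost loop in which each round fits a kernel learner with its own kernel parameter. Since no reference RVM implementation accompanies the paper, the sketch below uses scikit-learn's SVC purely as a runnable stand-in for the RVM weak learner; the round-dependent RBF width models the paper's "different kernel parameters." All names here are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.svm import SVC  # runnable stand-in; the paper's weak learner is an RVM

def adaboost_kernel(X, y, gammas, T=10):
    """Discrete AdaBoost over kernel classifiers, one RBF width per round.

    y must be in {-1, +1}. Round t uses kernel parameter gammas[t % len(gammas)],
    mirroring the idea of boosting kernel classifiers with varying parameters.
    Substitute a sparse Bayesian RVM for SVC to reproduce the paper's setup.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    learners, alphas = [], []
    for t in range(T):
        clf = SVC(kernel="rbf", gamma=gammas[t % len(gammas)])
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-12, 1 - 1e-12)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified samples
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, np.array(alphas)

def predict(learners, alphas, X):
    """Weighted-vote prediction of the boosted ensemble."""
    votes = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
    return np.sign(votes)
```

In this reading, the sparsity claimed in the abstract would come from each RVM round retaining only a few relevance vectors, so the final ensemble stays compact enough for real-time detection.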
