Feature Selection with Non-Parametric Mutual Information for Adaboost Learning

This paper describes a feature selection method based on quadratic mutual information. We present the formulation needed to estimate the mutual information non-parametrically from the data. The work is motivated by the high time cost of training with classical boosting algorithms: the method allows part of the effort spent in the first training run to be reused, speeding up subsequent retraining when the detectors must be updated after changes in the sample set.
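The paper's own formulation is not reproduced here, but the following is a minimal sketch (not the authors' code) of how quadratic mutual information between a scalar feature and discrete class labels can be estimated with Gaussian Parzen windows, in the style of non-parametric information-theoretic feature selection; the function names, the kernel width `sigma`, and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def gauss(d, var):
    """Gaussian kernel of variance `var`, evaluated at differences d."""
    return np.exp(-d ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def quadratic_mi(x, c, sigma=0.25):
    """Quadratic mutual information I(x; c) between a scalar feature x
    and discrete labels c, estimated with Parzen windows of width sigma.

    Relies on the identity that the integral of the product of two
    Gaussians is a Gaussian of their mean difference with doubled
    variance, so every integral collapses to a double sum over samples.
    """
    x = np.asarray(x, dtype=float)
    c = np.asarray(c)
    n = x.size
    # N x N matrix of kernel interactions between all sample pairs.
    k = gauss(x[:, None] - x[None, :], 2.0 * sigma ** 2)
    classes, counts = np.unique(c, return_counts=True)
    priors = counts / n                 # class priors P(c) = N_c / N
    v_in = 0.0   # within-class term; scaled by 1/N^2 at the end
    v_btw = 0.0  # class-vs-marginal cross term; scaled likewise
    for cls, pc in zip(classes, priors):
        m = c == cls
        v_in += k[np.ix_(m, m)].sum()
        v_btw += pc * k[m, :].sum()
    v_all = (priors ** 2).sum() * k.sum()  # marginal term
    return (v_in + v_all - 2.0 * v_btw) / n ** 2
```

Under these assumptions, feature selection would score each candidate feature (e.g., each Haar-like feature's response over the training set) with `quadratic_mi` and keep the highest-scoring features as input to AdaBoost; since the scores depend only on the data, they could in principle be cached and reused across retraining runs, in the spirit of the reuse the abstract describes.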
