Some aspects of error bounds in feature selection

Abstract. In this paper we discuss various bounds on the Bayesian probability of error that are used for feature selection and are based on distance measures and information measures. We show that these bounds are essentially of two types: one type can be related to the $f$-divergence, the other to information measures. This classification also clarifies some properties of these measures for the two-class problem and for the multiclass problem. We give some general bounds on the Bayesian probability of error and discuss various aspects of the different approaches.
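For concreteness, the following is a brief illustration of the two kinds of quantities the abstract refers to; it is a standard example added here, not a result stated in the abstract itself. For a convex function $f$ with $f(1) = 0$ and class-conditional densities $p_1(x)$, $p_2(x)$, the $f$-divergence is
\[
D_f(p_1 \,\|\, p_2) \;=\; \int p_2(x)\, f\!\left(\frac{p_1(x)}{p_2(x)}\right) dx ,
\]
and a classical distance-based bound of the type discussed is the Bhattacharyya bound on the two-class Bayes error with prior probabilities $P_1$, $P_2$:
\[
P_e \;\le\; \sqrt{P_1 P_2}\, \int \sqrt{p_1(x)\, p_2(x)}\, dx .
\]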