KL-divergence Based Feature Selection Algorithm with the Separate-class Strategy

Feature selection is one of the core issues in designing pattern recognition systems and has attracted considerable attention in the literature. Most feature selection methods in the literature handle relevance and redundancy analysis only from the point of view of the whole set of classes, thus neglecting the relationship between features and the separate class labels. To this end, a novel KL-divergence based feature selection algorithm is proposed that explicitly handles relevance and redundancy analysis for each class label with a separate-class strategy. A KL-divergence based metric of effective distance is also introduced into the algorithm to conduct the relevance and redundancy analysis. Experimental results show that the proposed algorithm is efficient and outperforms three representative algorithms, CFS, FCBF and ReliefF, with respect to the quality of the selected feature subset.
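As a rough illustration of the separate-class idea (a minimal sketch, not the paper's exact algorithm), the Python code below estimates, for each class label, the KL divergence between a feature's within-class distribution and its overall distribution; a larger divergence can be read as higher relevance of the feature to that particular class. The function names, histogram-based discretization, and smoothing constant are all assumptions made for illustration only.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions given as count or probability vectors."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def per_class_relevance(feature, labels, n_bins=10):
    """Per-class relevance proxy (illustrative, not the paper's exact metric):
    KL divergence between the feature's distribution within each class
    and its overall distribution, using a shared histogram binning."""
    bins = np.histogram_bin_edges(feature, bins=n_bins)
    overall, _ = np.histogram(feature, bins=bins)
    relevance = {}
    for c in np.unique(labels):
        within, _ = np.histogram(feature[labels == c], bins=bins)
        relevance[c] = kl_divergence(within, overall)
    return relevance

# Toy usage: a single feature correlated with three class labels.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=300)
feature = labels + rng.normal(scale=0.5, size=300)
print(per_class_relevance(feature, labels))
```

In this sketch, a feature that separates one class well but not the others would receive a high score only for that class, which is the kind of per-class information a whole-class relevance measure would average away.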