Particle swarm optimization for prototype reduction

This paper addresses prototype reduction for a nearest-neighbor classifier. An efficient method based on particle swarm optimization (PSO) is proposed for finding a good set of prototypes. Starting from an initial random selection of a small number of training patterns, PSO generates a set of prototypes that minimizes the error rate on the training set. To improve classification performance, prototype generation is repeated N times during the training phase; each of the resulting N prototype sets is then used to classify each test pattern, and the N classification results are combined by the "vote rule". The performance improvement over state-of-the-art approaches is validated through experiments on several benchmark datasets.
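The procedure above can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' exact formulation: the PSO coefficients (inertia 0.7, cognitive/social weights 1.5), one prototype per class, and the specific function names are all placeholders chosen for the sketch.

```python
import numpy as np

def nn_error(prototypes, proto_labels, X, y):
    # 1-NN error rate of (X, y) against the candidate prototype set
    d = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return (proto_labels[d.argmin(1)] != y).mean()

def pso_prototypes(X, y, n_per_class=1, n_particles=20, iters=50, rng=None):
    """One PSO run: evolve prototype positions to minimize training 1-NN error."""
    rng = np.random.default_rng() if rng is None else rng
    classes = np.unique(y)
    proto_labels = np.repeat(classes, n_per_class)

    def init_particle():
        # seed each particle with randomly selected training patterns per class
        rows = [X[rng.choice(np.flatnonzero(y == c), n_per_class, replace=False)]
                for c in classes]
        return np.concatenate(rows).ravel()

    pos = np.array([init_particle() for _ in range(n_particles)])
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([nn_error(p.reshape(-1, X.shape[1]), proto_labels, X, y)
                          for p in pos])
    g = pbest_err.argmin()
    gbest, gbest_err = pbest[g].copy(), pbest_err[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # assumed PSO coefficients (illustrative)
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        for i, p in enumerate(pos):
            e = nn_error(p.reshape(-1, X.shape[1]), proto_labels, X, y)
            if e < pbest_err[i]:
                pbest_err[i], pbest[i] = e, p.copy()
                if e < gbest_err:
                    gbest_err, gbest = e, p.copy()
    return gbest.reshape(-1, X.shape[1]), proto_labels

def ensemble_predict(X_test, runs):
    """Combine N independent PSO runs by majority vote on each test pattern."""
    votes = np.stack([labels[((X_test[:, None, :] - protos[None, :, :]) ** 2)
                             .sum(-1).argmin(1)]
                      for protos, labels in runs])
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

Each run of `pso_prototypes` plays the role of one of the N training repetitions; `ensemble_predict` implements the vote-rule combination of their 1-NN decisions.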
