Feature selection algorithm based on bare bones particle swarm optimization

Abstract Feature selection is a useful pre-processing technique for solving classification problems. As a nearly parameter-free optimization algorithm, bare bones particle swarm optimization (BPSO) has been applied to optimization over continuous and integer spaces, but it has not been applied to feature selection problems with binary variables. In this paper, we propose a new BPSO-based method, called binary BPSO, for finding an optimal feature subset. In this algorithm, a reinforced memory strategy is designed to update the local leaders of the particles, avoiding the degradation of outstanding genes within them, and a uniform combination is proposed to balance the local exploitation and global exploration of the algorithm. Moreover, the 1-nearest neighbor method is used as a classifier to evaluate the classification accuracy of a particle. Experiments on standard benchmark data sets show that the proposed algorithm is competitive in terms of both classification accuracy and computational performance.
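To make the overall scheme concrete, the following is a minimal sketch of a binary bare-bones PSO for feature selection, not the authors' exact algorithm: it omits the reinforced memory strategy and uniform combination described above, and it assumes one common binarization choice (a Gaussian sample centred between the personal and global best, squashed through a sigmoid and thresholded stochastically). The fitness of a particle is the leave-one-out 1-nearest-neighbor accuracy on the selected features, as in the paper; all function and parameter names are illustrative.

```python
import math
import random

def loo_1nn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the features where mask is True."""
    feats = [i for i, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0  # empty subset: worst possible fitness
    correct = 0
    for i in range(len(X)):
        best_d, best_j = float("inf"), None
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feats)
            if d < best_d:
                best_d, best_j = d, j
        correct += int(y[best_j] == y[i])
    return correct / len(X)

def binary_bbpso(X, y, n_particles=10, n_iter=30, seed=0):
    """Binary bare-bones PSO sketch: each bit is resampled from a Gaussian whose
    mean/spread come from the personal and global bests (standard BBPSO), then
    binarized via a sigmoid probability (an assumed binarization, not the paper's)."""
    rng = random.Random(seed)
    dim = len(X[0])
    swarm = [[rng.random() < 0.5 for _ in range(dim)] for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    pfit = [loo_1nn_accuracy(X, y, p) for p in swarm]
    g = max(range(n_particles), key=pfit.__getitem__)
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(n_iter):
        for k in range(n_particles):
            pos = []
            for d in range(dim):
                mu = (pbest[k][d] + gbest[d]) / 2.0      # Gaussian mean (BBPSO)
                sigma = abs(pbest[k][d] - gbest[d])      # Gaussian spread (BBPSO)
                sample = rng.gauss(mu, sigma)
                prob = 1.0 / (1.0 + math.exp(-sample))   # sigmoid -> bit probability
                pos.append(rng.random() < prob)
            fit = loo_1nn_accuracy(X, y, pos)
            if fit > pfit[k]:                            # greedy personal-best update
                pbest[k], pfit[k] = pos[:], fit
                if fit > gfit:
                    gbest, gfit = pos[:], fit
    return gbest, gfit
```

On a toy data set where one feature separates the classes and another is misleading noise, this sketch should drive the swarm toward the mask that keeps only the informative feature; swarm size, iteration count, and the sigmoid binarization are all assumptions chosen for brevity.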
