Improved Crowding Distance in Multi-objective Optimization for Feature Selection in Classification

Feature selection is an essential preprocessing step in data mining and machine learning. A feature selection task can be treated as a multi-objective optimization problem that simultaneously minimizes the classification error and the number of selected features. Many existing feature selection approaches, including multi-objective methods, neglect that multiple optimal solutions may exist: different feature subsets can achieve the same or similar classification performance. Furthermore, when evolutionary multi-objective optimization is applied to feature selection, a crowding distance metric typically plays a key role in environmental selection. However, existing crowding metrics computed from continuous/numeric values are not well suited to feature selection, whose search space is discrete. This paper therefore proposes a new environmental selection method that modifies the calculation of the crowding metrics. The proposed approach is expected to help a multi-objective feature selection algorithm find multiple potentially optimal feature subsets. Experiments on sixteen datasets of varying difficulty show that the proposed approach finds more diverse feature subsets achieving the same classification performance, without deteriorating performance in terms of hypervolume and inverted generational distance.
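
The abstract does not give the exact formulation of the modified crowding metric, so the following sketch is only illustrative: it computes the standard NSGA-II objective-space crowding distance and then adds a hypothetical decision-space term based on the Hamming distance between binary feature masks, so that feature subsets with identical objective values are no longer scored identically. The function names, the normalization, and the blending rule are assumptions for illustration, not the paper's method.

import numpy as np

def crowding_distance(objs):
    # Standard NSGA-II crowding distance over one non-dominated front.
    # objs: (n, m) array of objective values; here m = 2
    # (classification error, number of selected features).
    n, m = objs.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(objs[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf  # keep boundary solutions
        span = objs[order[-1], j] - objs[order[0], j]
        if span == 0:
            continue
        for k in range(1, n - 1):
            dist[order[k]] += (objs[order[k + 1], j] - objs[order[k - 1], j]) / span
    return dist

def subset_aware_crowding(objs, masks):
    # Hypothetical variant for discrete feature selection: add a
    # decision-space term so that solutions with identical objective
    # vectors (common in a discrete search space) are not scored identically.
    # masks: (n, d) binary array, one row per candidate feature subset.
    dist = crowding_distance(objs)
    n, d = masks.shape
    # Mean normalized Hamming distance from each subset to all others.
    pairwise = (masks[:, None, :] != masks[None, :, :]).sum(axis=2)
    hamming = pairwise.sum(axis=1) / (max(n - 1, 1) * d)
    finite = np.isfinite(dist)
    dist[finite] += hamming[finite]  # blending rule is an assumption
    return dist

For instance, two subsets that select different features but yield the same error and the same subset size receive identical standard crowding scores; the decision-space term favors retaining both when their feature masks differ substantially, which matches the paper's goal of preserving multiple equally good feature subsets.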
