Dynamic feature selection algorithm based on Q-learning mechanism

Feature selection is a technique for improving the classification accuracy of classifiers and a convenient aid to data visualization. Because Q-learning is an incremental, task-oriented, model-free learning algorithm, it is well suited to feature selection. This study proposes a dynamic feature selection algorithm that combines feature selection and Q-learning in a single framework. First, Q-learning is used to construct a discriminant function for each class of the data. Next, features are ranked by comprehensively considering the discriminant function vectors of all classes, and the ranking is refined continuously as the discriminant function vectors are updated. Finally, experiments compare the proposed algorithm with four other feature selection algorithms. The results on benchmark data sets verify its effectiveness: its classification performance is better than that of the compared algorithms, it also performs well at removing redundant features, and experiments on the effect of the learning rate show that parameter selection for the algorithm is simple.
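The abstract does not give the algorithm's exact update rules, but the core idea it describes (treating feature choice as actions whose Q-values are updated incrementally from a class-discrimination reward, then ranking features by their learned values) can be sketched as follows. The reward function, parameter names, and two-class setup here are illustrative assumptions, not the authors' formulation.

```python
import random

def separability(data, labels, features):
    """Toy reward signal (an assumption, not the paper's discriminant
    function): squared distance between the two class means over the
    currently selected features."""
    if not features:
        return 0.0
    groups = {}
    for row, y in zip(data, labels):
        groups.setdefault(y, []).append([row[f] for f in features])
    # Mean vector of each class over the selected features.
    means = [[sum(col) / len(col) for col in zip(*rows)]
             for rows in groups.values()]
    m0, m1 = means  # assumes exactly two classes for this sketch
    return sum((a - b) ** 2 for a, b in zip(m0, m1))

def q_rank_features(data, labels, episodes=200, alpha=0.1, seed=0):
    """Rank features by Q-values learned from marginal separability gains."""
    rng = random.Random(seed)
    n_features = len(data[0])
    q = [0.0] * n_features  # one Q-value per feature (action)
    for _ in range(episodes):
        selected, base = [], 0.0
        for _ in range(n_features):
            f = rng.randrange(n_features)  # purely random exploration
            if f in selected:
                continue
            new = separability(data, labels, selected + [f])
            reward = new - base            # marginal gain from adding f
            q[f] += alpha * (reward - q[f])  # incremental Q update
            selected.append(f)
            base = new
    # Higher Q-value = more discriminative feature.
    return sorted(range(n_features), key=lambda f: -q[f])
```

On a toy data set where feature 0 separates the classes and feature 1 is pure noise, e.g. `data = [[0, 5], [0.1, 1], [1, 5], [0.9, 1]]` with `labels = [0, 0, 1, 1]`, the ranking places feature 0 first, mirroring the paper's idea that the ranking emerges from the value updates rather than from a separate post-hoc scoring pass.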
