Large-Scale Feedforward Neural Network Optimization by a Self-Adaptive Strategy and Parameter Based Particle Swarm Optimization

The feedforward neural network (FNN) is one of the most widely used and fastest-developing artificial neural networks. Many evolutionary computation (EC) methods have been used to optimize the weights of FNNs. However, as the dimensionality of a dataset increases, the number of weights grows dramatically, and on high-dimensional datasets it becomes impossible for EC methods applied directly to the weights to reach near-optimal solutions in an acceptable time. Feature selection can effectively reduce the computational complexity of an FNN by removing irrelevant and redundant features, so optimizing an FNN with EC methods may become practical if feature selection is first applied to large-scale datasets. In this paper, we present a self-adaptive parameter and strategy-based particle swarm optimization (SPS-PSO) algorithm to optimize FNNs together with feature selection. First, we propose an optimization model for the FNN by transforming FNN design into a weight-optimization problem, and we simultaneously present a feature selection optimization model. Second, we present the SPS-PSO algorithm, which uses a self-adaptive mechanism for both parameters and strategies and employs five candidate solution generating strategies (CSGSs). The experiments are divided into two groups. In the first group, SPS-PSO and three other EC methods are used to directly optimize the FNN weights on eight unmodified datasets. In the second group, we first apply SPS-PSO-based feature selection to the original datasets, using the $k$-nearest neighbor (KNN) classifier as the evaluation function to save time, and obtain eight reduced datasets. These reduced datasets are then used as the inputs of the FNN, whose weights are optimized again by SPS-PSO and the three other EC methods, so that the classification accuracy can be compared with that of the first group. The experimental results show that SPS-PSO has an advantage over the other EC methods in optimizing the FNN weights. Meanwhile, SPS-PSO-based feature selection reduces the solution size and computational complexity while maintaining classification accuracy when it is used to preprocess the datasets for the FNN: a solution that originally had more than 700 000 dimensions can be reduced to hundreds of dimensions.
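As a rough illustration of the weight-optimization model described above, the sketch below flattens the weights of a single-hidden-layer FNN into one particle vector and minimizes the classification error with a plain global-best PSO. It is a minimal sketch, not the paper's SPS-PSO: the network architecture, hyperparameter values, and all function names (`init_shapes`, `fnn_predict`, `pso_train`, etc.) are illustrative assumptions, and the self-adaptive parameter/strategy mechanism and the five CSGSs are omitted.

```python
# Minimal sketch (assumed names and settings, not the paper's SPS-PSO):
# global-best PSO over the flattened weights of a one-hidden-layer FNN.
import numpy as np

def init_shapes(n_in, n_hidden, n_out):
    # (W1, b1, W2, b2) shapes for a single-hidden-layer network
    return [(n_in, n_hidden), (n_hidden,), (n_hidden, n_out), (n_out,)]

def unflatten(vec, shapes):
    # Split one flat particle vector back into weight matrices and biases
    params, i = [], 0
    for s in shapes:
        size = int(np.prod(s))
        params.append(vec[i:i + size].reshape(s))
        i += size
    return params

def fnn_predict(vec, X, shapes):
    W1, b1, W2, b2 = unflatten(vec, shapes)
    h = np.tanh(X @ W1 + b1)               # hidden layer
    return np.argmax(h @ W2 + b2, axis=1)  # class with the largest output

def fitness(vec, X, y, shapes):
    # Classification error rate; the swarm minimizes this value
    return np.mean(fnn_predict(vec, X, shapes) != y)

def pso_train(X, y, n_hidden=10, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    shapes = init_shapes(X.shape[1], n_hidden, int(y.max()) + 1)
    dim = sum(int(np.prod(s)) for s in shapes)   # total number of weights
    pos = rng.uniform(-1, 1, (swarm, dim))
    vel = np.zeros((swarm, dim))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X, y, shapes) for p in pos])
    gbest = pbest[pbest_fit.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((swarm, dim)), rng.random((swarm, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([fitness(p, X, y, shapes) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmin()].copy()
    return gbest, shapes
```

In the paper's two-stage setup, a feature-selection pass (evaluated with KNN accuracy for speed) would first shrink the columns of `X` before this weight-optimization stage runs on the reduced dataset; SPS-PSO itself would additionally adapt its parameters and choose among the five CSGSs at each update rather than using the fixed velocity rule shown here.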
