Two-Input Power-Activation Neural Network Weights-Direct-Determination and Structure Optimized by Particle Swarm Optimization

Artificial neural networks trained with back-propagation (BP) algorithms show inherent weaknesses in determining weights and in finding an optimal network structure. To overcome these weaknesses, this paper constructs and investigates a two-input single-output (TISO) power-activation feed-forward neural network model, based on the theory of multivariate approximation and power-series expansion. A weights-direct-determination (WDD) method for this feed-forward network is then studied and proposed. Further, to optimize the network structure, particle swarm optimization (PSO) is applied to search for the globally optimal number of hidden-layer neurons. Combining WDD and PSO in a novel manner finally yields a weights-structure-determination (WSD) methodology. Computer-based numerical experiments on various objective functions substantiate the superiority of the proposed network in terms of approximation and denoising, and further comparison experiments with stochastic gradient descent (SGD) and previous work substantiate the efficacy of the proposed WSD methodology.
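The core idea of weights-direct-determination can be illustrated with a minimal sketch: for a two-input power-activation network, each hidden neuron computes a monomial x1^i * x2^j of the truncated bivariate power series, so the output weights reduce to a linear least-squares problem solvable in one step via the Moore-Penrose pseudoinverse, with no iterative BP training. The function names, the maximum degree, and the target function below are illustrative assumptions, not taken from the paper, and the structure search here is a plain validation sweep standing in for the paper's PSO.

```python
import numpy as np

def power_features(X, degree):
    """Hidden-layer activation matrix: one column per monomial
    x1^i * x2^j with i + j <= degree (power activations)."""
    x1, x2 = X[:, 0], X[:, 1]
    cols = [x1**i * x2**j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def wdd_weights(X, y, degree):
    """Weights-direct-determination: solve the output weights in one
    step as the least-squares solution w = pinv(A) @ y."""
    A = power_features(X, degree)
    return np.linalg.pinv(A) @ y

# Illustrative target: f(x1, x2) = exp(x1) * sin(x2) on [-1, 1]^2.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 2))
y = np.exp(X[:, 0]) * np.sin(X[:, 1])

w = wdd_weights(X, y, degree=6)
mse = np.mean((power_features(X, 6) @ w - y) ** 2)
```

In place of the sweep-free choice of `degree=6` above, the paper's WSD methodology would let PSO search over the hidden-layer size (equivalently, the truncation degree), scoring each candidate by the error obtained after direct weight determination.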
