WASD Algorithm with Pruning-While-Growing and Twice-Pruning Techniques for Multi-Input Euler Polynomial Neural Network

Differing from conventional back-propagation (BP) neural networks, a novel multi-input Euler polynomial neural network (MIEPNN), specifically a 4-input Euler polynomial neural network (4IEPNN), is established and investigated in this paper. To achieve satisfactory performance of the MIEPNN, a weights-and-structure-determination (WASD) algorithm with pruning-while-growing (PWG) and twice-pruning (TP) techniques is developed for it. By employing the weights-direct-determination (WDD) method, the WASD algorithm not only directly determines the optimal connecting weights between the hidden layer and the output layer, but also obtains the optimal number of hidden-layer neurons. Specifically, a sub-optimal structure is first obtained via the PWG technique, and the redundant hidden-layer neurons are then further pruned via the TP technique, which yields the optimal structure of the MIEPNN. To provide a reasonable choice in practice, several different MATLAB computing routines related to the WDD method are studied. Comparative numerical-experiment results of the 4IEPNN using these MATLAB computing routines, together with those of the standard multi-layer perceptron (MLP) neural network, further verify the superior performance and efficacy of the proposed MIEPNN equipped with the WASD algorithm (including the PWG and TP techniques) in terms of training, testing, and prediction.
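To make the WDD idea concrete, the sketch below (a minimal single-input example, not the authors' code) builds an Euler-polynomial hidden layer and obtains the hidden-to-output weights in one least-squares step rather than by iterative BP training. The target function, the number of hidden-layer neurons, the Euler-polynomial recursion used, and the particular MATLAB routines compared (pinv, mldivide, lsqminnorm) are all assumptions for illustration; the paper's own routine choices are not named in this abstract.

```matlab
% Minimal WDD sketch (illustrative assumption, not the authors' code):
% hidden-layer activations are Euler polynomials of the input, and the
% hidden-to-output weights are computed directly by a least-squares solve.

x = linspace(-1, 1, 200)';        % sample inputs (column vector)
y = exp(x) .* sin(pi * x);        % assumed target function to approximate
N = 8;                            % assumed number of hidden-layer neurons

% Build the activation matrix whose (n+1)-th column is the Euler polynomial
% E_n(x), via the identity sum_{i=0}^{n} nchoosek(n,i)*E_i(x) + E_n(x) = 2*x^n,
% i.e. E_n(x) = x^n - (1/2) * sum_{i=0}^{n-1} nchoosek(n,i)*E_i(x).
Phi = zeros(numel(x), N);
Phi(:, 1) = 1;                                    % E_0(x) = 1
for n = 1:N-1
    s = zeros(size(x));
    for i = 0:n-1
        s = s + nchoosek(n, i) * Phi(:, i+1);
    end
    Phi(:, n+1) = x.^n - s / 2;                   % E_n(x)
end

% Weights direct determination: three MATLAB routines solving Phi * w = y
% in the least-squares sense (lsqminnorm requires R2017b or later).
w_pinv  = pinv(Phi) * y;          % Moore-Penrose pseudoinverse
w_back  = Phi \ y;                % mldivide (QR-based least squares)
w_lsqmn = lsqminnorm(Phi, y);     % minimum-norm least-squares solution

fprintf('training MSE: pinv %.2e, mldivide %.2e, lsqminnorm %.2e\n', ...
    mean((Phi * w_pinv - y).^2), ...
    mean((Phi * w_back - y).^2), ...
    mean((Phi * w_lsqmn - y).^2));
```

In this one-step setting, the PWG and TP techniques would amount to adding or removing columns of Phi (hidden-layer neurons) and re-solving the least-squares problem, keeping only the structure that minimizes the validation error.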
