A Polynomial Neural Network with Controllable Precision and Human-Readable Topology II: Accelerated Approach Based on Expanded Layer

What if the black-box nature of neural networks could be addressed by converting a Taylor series into a network? The controllable and readable polynomial neural network (Gang transform, or CR-PNN) is the Taylor expansion expressed in network form; its forward propagation is about ten times more efficient than that of a typical BPNN. In addition, its approximation precision is controllable and its internal structure is interpretable, making it suitable for prediction and system identification. However, its computational complexity grows as the network depth increases. Here, we present an accelerated method based on an expanded layer to optimize CR-PNN. The CR-PNN II structure runs significantly faster than CR-PNN I while preserving the properties of CR-PNN I.
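The core idea, a Taylor-style polynomial expansion expressed as a network, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the layer rule `a = (W @ a) * x`, where each layer's linear map is multiplied elementwise by the input to raise the polynomial degree, follows the related Dendrite Net formulation; the function name `crpnn_forward` and the weight shapes are hypothetical.

```python
import numpy as np

def crpnn_forward(x, weights):
    """Forward pass of a polynomial-network sketch.

    Each hidden layer applies a linear map to the previous activation
    and multiplies the result elementwise by the input x, so a depth-l
    stack produces polynomial terms of degree l+1 in x -- a Taylor-like
    expansion written in network form.
    """
    a = x
    for w in weights[:-1]:
        a = (w @ a) * x      # Hadamard product with x raises the polynomial order
    return weights[-1] @ a   # final linear readout

# Example: 2 inputs, one polynomial layer, scalar output.
# With an identity hidden weight and a summing readout, the network
# computes x1**2 + x2**2, i.e. a pure second-order polynomial term.
x = np.array([1.0, 2.0])
weights = [np.eye(2), np.ones((1, 2))]
y = crpnn_forward(x, weights)  # -> [5.0]
```

Because the forward pass is only matrix products and elementwise multiplications, the learned weights can be expanded symbolically into explicit polynomial coefficients, which is what makes the topology human-readable.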
