Neural networks with dynamic synapses for time-series prediction
Feedforward neural networks commonly employ scalar synapses. Dynamic neural networks generalize feedforward networks by replacing each scalar synapse with a linear filter, which gives them the ability to realize time-dependent dynamic mappings and makes them suitable for time-series prediction, nonlinear dynamic system identification, and signal processing applications. This dissertation considers scalar, Finite Impulse Response (scFIR), Infinite Impulse Response (scIIR), lattice, and gamma synapses. As introduced in this dissertation, such synapses may be represented as time-delay operators, a formulation whose advantage is that any number of filter types may be combined within the same network.
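As a concrete illustration (not code from the dissertation), the sketch below implements two of these synapse types in NumPy: an FIR synapse, of which the scalar synapse is the single-tap special case, and a gamma synapse built from a cascade of identical leaky integrators in the style of de Vries and Principe. The function names and the single-parameter gamma memory are assumptions made for the example.

    import numpy as np

    def fir_synapse(x, w):
        # FIR synapse: y[t] = sum_k w[k] * x[t - k].
        # A scalar synapse is the special case len(w) == 1.
        return np.convolve(x, w)[: len(x)]

    def gamma_synapse(x, w, mu):
        # Gamma synapse: the input feeds a cascade of identical leaky
        # integrators (taps), and the output is a weighted sum of the taps:
        #   g_k[t] = (1 - mu) * g_k[t-1] + mu * g_{k-1}[t-1],  g_0 = x.
        w = np.asarray(w, dtype=float)
        K = len(w)
        g = np.zeros(K + 1)            # g[0] is the input line, g[1:] the taps
        y = np.empty(len(x))
        for t, xt in enumerate(x):
            for k in range(K, 0, -1):  # deepest tap first, so g[k-1] is still old
                g[k] = (1 - mu) * g[k] + mu * g[k - 1]
            g[0] = xt
            y[t] = w @ g[1:]
        return y

With mu = 1 the gamma cascade reduces to a tapped delay line, recovering the FIR synapse, which illustrates how a single delay-operator formulation can accommodate several filter families in one network.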
The main focus of this work is the development of efficient algorithms that facilitate the application of general dynamic networks to complex problems. Training dynamic neural networks typically requires repeated computation of error gradients, so methods for both the exact computation and the efficient approximation of the error gradient are presented. Second-order derivative information is also used by some training algorithms and by various network analysis techniques; algorithms for calculating second-order error derivatives are therefore developed and compared, and the trade-off between the computational complexity and the accuracy of the proposed algorithms is discussed. The performance of these computational methods is verified by applying them to the selection and training of dynamic neural network architectures for the prediction of several benchmark time series. The developed algorithms enable the use of two architecture selection methods, Optimal Brain Surgeon and Structural Network Construction, both of which are shown to improve the generalization capabilities of dynamic neural networks.
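As an illustration of how second-order information feeds into architecture selection, the sketch below applies the standard Optimal Brain Surgeon step of Hassibi and Stork: rank weights by a saliency computed from the inverse Hessian, delete the least salient weight, and optimally retune the survivors. This is a generic sketch, not the dissertation's implementation; in particular, the explicit matrix inverse is a simplifying assumption that practical algorithms replace with recursive approximations.

    import numpy as np

    def obs_prune_step(w, H):
        # One Optimal Brain Surgeon step on weight vector w with error
        # Hessian H (assumed symmetric positive definite).
        H_inv = np.linalg.inv(H)
        # Saliency of weight q: the error increase if w[q] is zeroed and
        # the other weights are optimally adjusted:
        #   L_q = w[q]**2 / (2 * H_inv[q, q])
        saliency = w ** 2 / (2.0 * np.diag(H_inv))
        q = int(np.argmin(saliency))
        # Optimal compensation of the remaining weights, which drives
        # w[q] exactly to zero:
        #   delta_w = -(w[q] / H_inv[q, q]) * H_inv[:, q]
        w_new = w - (w[q] / H_inv[q, q]) * H_inv[:, q]
        return w_new, q, saliency[q]

Pruning proceeds by repeating this step until the saliency of the best remaining candidate exceeds a tolerance on the acceptable error increase.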