Local output gamma feedback neural network

A theory is introduced for a multi-layered local output gamma feedback neural network (LOGF-NN) within the locally recurrent globally feedforward neural networks paradigm. It is developed for classification and prediction tasks in spatio-temporal systems, and it represents different time scales through the incorporation of a gamma memory. The update equations for the feedforward and temporal weights and parameters are derived through the backpropagation through time (BPTT) learning algorithm. As a demonstration, the network is applied to the benchmark problem of single-step sunspot series prediction and compared with other neural network (weight-elimination neural network: WNET) and statistical (linear autoregressive and threshold autoregressive: TAR) methods, using the average relative variance (ARV) as the measure of prediction accuracy. The proposed LOGF-NN performs comparably to the TAR method and outperforms the linear AR and WNET approaches.
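The two quantities named in the abstract lend themselves to a short illustration. The Python sketch below is not the paper's formulation; it is a minimal rendering, under assumed choices, of (a) the gamma memory recursion of Príncipe's gamma model, x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1), applied to a single neuron's own past output (the "local output feedback" idea), and (b) the average relative variance, i.e. the mean squared prediction error normalised by the variance of the series. The function names, the tanh nonlinearity, and the scalar-input neuron are illustrative assumptions; the paper derives the actual weight and memory-parameter updates via BPTT.

```python
import numpy as np

def gamma_memory_step(state, x_in, mu):
    """Advance a K-stage gamma memory by one time step.

    state : shape (K,) tap values x_1..x_K at time t-1
    x_in  : the signal x_0(t-1) feeding the first tap
    mu    : memory parameter; mu = 1 reduces to a pure tap-delay line
    """
    new_state = np.empty_like(state)
    prev = x_in
    for k in range(len(state)):
        # x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1)
        new_state[k] = (1.0 - mu) * state[k] + mu * prev
        prev = state[k]
    return new_state

def logf_neuron_forward(u, w_in, w_fb, mu, K=4):
    """Forward pass of one neuron whose own past outputs are fed back
    through a gamma memory (hypothetical single-neuron illustration)."""
    taps = np.zeros(K)
    y_prev = 0.0
    outputs = []
    for u_t in u:
        taps = gamma_memory_step(taps, y_prev, mu)  # memory of past outputs
        y_t = np.tanh(w_in * u_t + w_fb @ taps)     # local output feedback
        outputs.append(y_t)
        y_prev = y_t
    return np.array(outputs)

def arv(y_true, y_pred):
    """Average relative variance: MSE normalised by the series variance."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2) / np.var(y_true)

# Example call with arbitrary illustrative parameters:
# y = logf_neuron_forward(np.sin(np.linspace(0, 6, 50)),
#                         w_in=0.8, w_fb=np.full(4, 0.1), mu=0.6)
```

Note that mu = 1 collapses the gamma memory to an ordinary tap-delay line, while smaller values give a longer, leakier memory; this is how a gamma memory spans different time scales with a single trainable parameter per memory structure.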

[1]  Barak A. Pearlmutter. Learning state space trajectories in recurrent neural networks: a preliminary report, 1988.

[2]  David E. Rumelhart, et al. Predicting the Future: a Connectionist Approach, 1990, Int. J. Neural Syst.

[3]  Ah Chung Tsoi, et al. Locally recurrent globally feedforward networks: a critical review of architectures, 1994, IEEE Trans. Neural Networks.

[4]  Paul J. Werbos. Backpropagation Through Time: What It Does and How to Do It, 1990, Proc. IEEE.

[5]  G. Yule. On a Method of Investigating Periodicities in Disturbed Series, with Special Reference to Wolfer's Sunspot Numbers, 1927.

[6]  Alexander H. Waibel. Modular Construction of Time-Delay Neural Networks for Speech Recognition, 1989, Neural Computation.

[7]  H. Tong, et al. Threshold Autoregression, Limit Cycles and Cyclical Data, 1980.

[8]  J. J. Hopfield. Neural computation by concentrating information in time, 1987, Proceedings of the National Academy of Sciences of the United States of America.

[9]  Barak A. Pearlmutter. Learning State Space Trajectories in Recurrent Neural Networks, 1989, Neural Computation.

[10]  Yoshua Bengio, et al. Hierarchical Recurrent Neural Networks for Long-Term Dependencies, 1995, NIPS.

[11]  Jeffrey L. Elman. Finding Structure in Time, 1990, Cogn. Sci.

[12]  Ronald J. Williams, et al. A Learning Algorithm for Continually Running Fully Recurrent Neural Networks, 1989, Neural Computation.

[13]  A. Izenman. J. R. Wolf and the Zürich sunspot relative numbers, 1985.

[14]  José Carlos Príncipe, et al. The gamma model: a new neural model for temporal processing, 1992, Neural Networks.

[15]  Tad Hogg, et al. A Dynamical Approach to Temporal Pattern Processing, 1987, NIPS.

[16]  A. Lapedes, et al. Nonlinear Signal Processing Using Neural Networks, 1987.

[17]  Kumpati S. Narendra, et al. Identification and control of dynamical systems using neural networks, 1990, IEEE Trans. Neural Networks.