Recurrent neural networks and time series prediction

This work presents an exploration of the dynamic behavior of small recurrent networks. Proofs are presented showing that small recurrent networks can produce fixed points and limit cycles. As an application, a network trained on a noisy signal whose clean component was a sine wave produced good signal recognition at signal-to-noise ratios as low as 0.167. Networks were also designed that generate chaotic output sequences. These networks provide models for behaviors observed in biological systems; for example, a network was designed to reproduce the behaviors observed in the olfactory bulbs of rabbits, including the ability to switch between chaos and limit cycles, and between different limit cycles, by changing external inputs. The impact of the number of nodes, the amount of refractoriness, and the network topology on network behavior is also studied; any of these can change the character of the network output.

On-line and off-line training algorithms for these networks are also presented. Existing models are analyzed to find methods for reducing training times; for example, reductions in total training time of 46% were observed with the addition of weight projections. Other modifications improved the prediction accuracy of the networks. One such change was to replace the standard Euler numerical integration with a fourth-order Runge-Kutta routine. The addition of teacher forcing and of weight elimination was also studied; both improved performance with a negligible impact on training time. New weight update equations were derived that allow weighted inputs to be included in existing networks. This frequently improved performance, but introduces a speed penalty.

The performance of recurrent networks applied to the problem of time series prediction was then evaluated.
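The Euler-to-Runge-Kutta change mentioned above can be sketched numerically. The following is a minimal illustration, assuming a common additive continuous-time RNN of the form dx/dt = -x + W tanh(x) + I (the exact node equations used in the thesis may differ); it compares the one-step error of Euler against classical fourth-order Runge-Kutta at the same step size.

```python
import numpy as np

# Assumed continuous-time RNN dynamics of the common additive form
#   dx/dt = -x + W @ tanh(x) + I
def dynamics(x, W, I):
    return -x + W @ np.tanh(x) + I

def euler_step(x, W, I, dt):
    # First-order Euler: local truncation error O(dt^2)
    return x + dt * dynamics(x, W, I)

def rk4_step(x, W, I, dt):
    # Classical fourth-order Runge-Kutta: local truncation error O(dt^5)
    k1 = dynamics(x, W, I)
    k2 = dynamics(x + 0.5 * dt * k1, W, I)
    k3 = dynamics(x + 0.5 * dt * k2, W, I)
    k4 = dynamics(x + dt * k3, W, I)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))     # random 3-node network for illustration
I = np.zeros(3)
x = rng.standard_normal(3)

# Reference state after time dt, computed with 100 much finer Euler substeps
dt = 0.1
x_ref = x.copy()
for _ in range(100):
    x_ref = euler_step(x_ref, W, I, dt / 100)

err_euler = np.linalg.norm(euler_step(x, W, I, dt) - x_ref)
err_rk4 = np.linalg.norm(rk4_step(x, W, I, dt) - x_ref)
print(f"one-step error: Euler {err_euler:.2e}, RK4 {err_rk4:.2e}")
```

At the same step size, the RK4 step lands much closer to the finely resolved trajectory, which is why the substitution can improve prediction accuracy without shrinking the step.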
The methods used for these comparisons were classical mathematical (ARMA) analysis, back-propagation, a recurrent network with an on-line training algorithm, and a recurrent network with an off-line training algorithm. The prediction accuracy of the recurrent networks was comparable to, or better than, that of back-propagation and the ARMA models. However, a seasonal means model for the temperature data was superior to any of the networks.
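The seasonal means baseline that beat the networks on temperature data is worth making concrete. The sketch below uses an assumed formulation (the thesis's exact model may differ): forecast each point as the mean of all earlier observations in the same seasonal slot, demonstrated on synthetic monthly temperatures against a naive last-value predictor.

```python
import numpy as np

def seasonal_means_forecast(series, period):
    # Predict series[t] as the mean of earlier values at the same phase
    # (t - period, t - 2*period, ...); NaN until a slot has history.
    preds = []
    for t in range(len(series)):
        past = [series[s] for s in range(t % period, t, period)]
        preds.append(np.mean(past) if past else np.nan)
    return np.array(preds)

# Synthetic monthly temperature series: annual cycle plus noise
rng = np.random.default_rng(1)
months = np.arange(120)
temps = 15 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 120)

preds = seasonal_means_forecast(temps, period=12)
mask = ~np.isnan(preds)                 # skip the first year (no history yet)
mse_seasonal = np.mean((preds[mask] - temps[mask]) ** 2)

naive = np.r_[np.nan, temps[:-1]]       # predict each month by the previous one
mse_naive = np.mean((naive[mask] - temps[mask]) ** 2)
print(f"MSE: seasonal means {mse_seasonal:.2f}, naive {mse_naive:.2f}")
```

On strongly periodic data like temperatures, this model's error approaches the noise floor, which makes it a demanding baseline for any learned predictor.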