BTT: Back-Propagation Through Time

As explained in Chapt. 14, an extension of the classical back-propagation algorithm can be used to train recurrent neural networks. Not only is it possible to construct networks which possess a set of predetermined stable states (attractors), but one can even impose time-dependent trajectories which are traced out in the state space of the network. The program RECURR implements the algorithm of back-propagation through time (BTT) derived in Sect. 14.3. The network consists of $N$ fully coupled neurons which can take on continuous values in the interval $s_i = -1 \ldots +1$. Either one or two of these neurons represent the output of the network. The goal of the algorithm is to find a set of weights $w_{ij}$ which guarantees that the output signals $s_i(t)$, $i \in \Omega$, follow some predetermined functions $\zeta_i(t)$ as closely as possible over a time interval $0 \leq t \leq \tau$. This is achieved by minimizing an error functional, cf. (14.21), using the gradient descent method.
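
To illustrate the idea, the following is a minimal sketch of back-propagation through time, not the actual RECURR program. It assumes a simple discrete-time version of the dynamics, $s_i(t+1) = \tanh\bigl(\sum_j w_{ij} s_j(t)\bigr)$, with the error functional replaced by a sum over time steps; all names (N, T, eta, the sinusoidal targets) are illustrative choices, not taken from the source.

```python
import numpy as np

# Sketch of BTT for a fully coupled recurrent network (assumed
# discrete-time dynamics; RECURR itself may differ in detail).
N, T = 10, 50                        # neurons, time steps
out = [0, 1]                         # indices of the output neurons (Omega)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(N, N))    # fully coupled weights w_ij
s0 = rng.uniform(-1, 1, size=N)           # initial state in [-1, +1]

# Example target trajectory zeta_i(t): a sine and a cosine
t_grid = np.arange(T)
zeta = np.stack([np.sin(2 * np.pi * t_grid / T),
                 np.cos(2 * np.pi * t_grid / T)])

eta = 0.05                           # gradient descent step size
for epoch in range(2000):
    # Forward pass: unroll the dynamics, storing the whole trajectory
    s = np.empty((T + 1, N))
    s[0] = s0
    for t in range(T):
        s[t + 1] = np.tanh(W @ s[t])

    # Deviation of the output neurons from the target functions
    err = np.zeros((T + 1, N))
    for k, i in enumerate(out):
        err[1:, i] = s[1:, i] - zeta[k]

    # Backward pass: propagate the error back through time and
    # accumulate the gradient dE/dw_ij
    grad = np.zeros_like(W)
    delta = np.zeros(N)
    for t in range(T, 0, -1):
        delta = (delta + err[t]) * (1 - s[t] ** 2)   # through tanh'
        grad += np.outer(delta, s[t - 1])
        delta = W.T @ delta                          # to earlier time step

    W -= eta * grad                                  # gradient descent step

E = 0.5 * np.sum(err[1:] ** 2)
print(f"final trajectory error E = {E:.4f}")
```

The backward loop is the essential point: because every state $s(t)$ depends on the weights both directly and through all earlier states, the error signal must be carried backwards through the entire unrolled trajectory before the gradient step is taken.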