Delayed Learning and the Organized States

Elman proposed a network with a context layer for time-series processing. The context layer holds the previous output of the hidden layer, and this stored output is fed back into the hidden layer at the next step of the time series. In this paper, the context layer is reformulated as an internal memory layer, which is connected from the hidden layer through connection weights that form the internal memory. A new learning algorithm, called time-delayed back-propagation learning, is developed for this internal memory. The capability of the network with the internal memory layer, and the organized states of the internal memory, are demonstrated by applying the network to a simple sinusoidal time series.
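To make the feedback mechanism concrete, the following is a minimal sketch (not the paper's implementation) of an Elman-style forward pass in NumPy; the function name `elman_forward` and the weight shapes are illustrative assumptions. The context layer copies the hidden-layer output at each step and feeds it back into the hidden layer at the next step:

```python
import numpy as np

def elman_forward(xs, W_in, W_ctx, W_out):
    """Forward pass of a minimal Elman-style network (illustrative sketch).

    The context layer stores the previous hidden-layer activations and is
    fed back into the hidden layer at each time step.
    """
    hidden_size = W_in.shape[0]
    context = np.zeros(hidden_size)    # context layer starts at zero
    outputs = []
    for x in xs:
        # hidden layer sees both the current input and the context layer
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        outputs.append(W_out @ hidden)
        context = hidden               # context layer copies hidden output
    return np.array(outputs)

# toy run on a short sinusoidal sequence (untrained random weights)
rng = np.random.default_rng(0)
T, n_in, n_hid, n_out = 8, 1, 4, 1
xs = np.sin(np.linspace(0, 2 * np.pi, T)).reshape(T, n_in)
W_in = rng.normal(size=(n_hid, n_in))
W_ctx = rng.normal(size=(n_hid, n_hid))
W_out = rng.normal(size=(n_out, n_hid))
ys = elman_forward(xs, W_in, W_ctx, W_out)
print(ys.shape)  # → (8, 1)
```

In the paper's variant, the copy step is replaced by a weighted connection from the hidden layer, and those weights are trained with the time-delayed back-propagation algorithm.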