Hill climbing in recurrent neural networks for learning the a^n b^n c^n language

A simple recurrent neural network is trained on a one-step look-ahead prediction task for symbol sequences of the context-sensitive a^n b^n c^n language. Using an evolutionary hill-climbing strategy with incremental learning, the network learns to predict strings of the language up to depth n = 12. The experiments and the algorithms used are described. The activations of the hidden units of the trained network are displayed in a 3D graph and analysed.
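
A minimal Python sketch of the setup described above, not the authors' code: an Elman-style simple recurrent network trained by hill climbing on one-step prediction, with the training set extended incrementally from n = 1 upward. The network size, mutation scale, step counts, and greedy acceptance rule are all illustrative assumptions.

import numpy as np

SYMBOLS = "abc"  # assumption: three-symbol alphabet, one-hot coded

def make_string(n):
    """Return the one-hot coded symbol sequence for a^n b^n c^n."""
    seq = "a" * n + "b" * n + "c" * n
    X = np.zeros((len(seq), 3))
    for t, s in enumerate(seq):
        X[t, SYMBOLS.index(s)] = 1.0
    return X

def init_weights(hidden=5, rng=None):
    """Random initial weights; hidden-layer size is an assumption."""
    if rng is None:
        rng = np.random.default_rng(0)
    return {
        "W_in": rng.normal(0, 0.5, (hidden, 3)),
        "W_rec": rng.normal(0, 0.5, (hidden, hidden)),
        "W_out": rng.normal(0, 0.5, (3, hidden)),
    }

def run(net, X):
    """Forward pass; return summed squared one-step prediction error."""
    h = np.zeros(net["W_rec"].shape[0])
    err = 0.0
    for t in range(len(X) - 1):
        h = np.tanh(net["W_in"] @ X[t] + net["W_rec"] @ h)  # recurrent state
        y = 1 / (1 + np.exp(-(net["W_out"] @ h)))           # predicted next symbol
        err += np.sum((y - X[t + 1]) ** 2)
    return err

def hill_climb(max_depth=12, steps=20000, sigma=0.1, seed=0):
    """Incremental hill climbing: grow the training set one depth at a time,
    accepting a random weight perturbation only if total error does not rise."""
    rng = np.random.default_rng(seed)
    net = init_weights(rng=rng)
    train = []
    for n in range(1, max_depth + 1):
        train.append(make_string(n))
        best = sum(run(net, X) for X in train)
        for _ in range(steps):
            trial = {k: W + rng.normal(0, sigma, W.shape) for k, W in net.items()}
            e = sum(run(trial, X) for X in train)
            if e <= best:  # greedy acceptance of non-worsening mutations
                net, best = trial, e
        print(f"n={n:2d}  error={best:.4f}")
    return net

if __name__ == "__main__":
    hill_climb(max_depth=4, steps=2000)  # small run for illustration

Counting the a's in a prefix and predicting the b-to-c and c-to-end transitions is what forces the hidden state to encode depth, which is what the 3D plot of hidden-unit activations in the paper is meant to reveal.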