Learning Recurrent Dynamics using Differential Evolution

This paper presents an efficient and powerful approach for learning dynamics with Recurrent Neural Networks (RNNs). No specialized or fine-tuned RNNs are used; instead, standard RNNs with a single fully connected hidden layer are trained. The training procedure is based on a variant of Differential Evolution (DE) with a modified mutation scheme that allows the population size to be reduced to five in our setup while still yielding very good results within a few generations. For several common Multiple Superimposed Oscillator (MSO) instances, new state-of-the-art results are presented, which are consistently several orders of magnitude better than the results published so far. Furthermore, for new and even more difficult instances, i.e., MSO9–MSO12, our setup achieves lower error rates than previously reported for the best system on MSO5–MSO8.
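
To illustrate the kind of training loop involved, the following minimal sketch shows a standard DE/rand/1/bin scheme evolving a flattened RNN weight vector with a small population. The paper's modified mutation scheme is not reproduced here, and the fitness function (e.g., an error measure of the RNN output against an MSO target signal) as well as all identifiers are hypothetical assumptions for illustration only.

import numpy as np

def de_train(dim, fitness, pop_size=5, F=0.5, CR=0.9, generations=100, rng=None):
    """Evolve a flat weight vector of length `dim` by minimizing `fitness`.

    This is plain DE/rand/1/bin, not the modified mutation scheme of the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
    fit = np.array([fitness(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals different from the target i.
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)                  # DE/rand/1 mutation
            cross = rng.random(dim) < CR              # binomial crossover mask
            cross[rng.integers(dim)] = True           # ensure at least one gene is taken
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial)
            if f_trial < fit[i]:                      # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

In such a setup, `fitness` would decode the flat vector into the RNN's weight matrices, run the network over the MSO training sequence, and return the prediction error to be minimized.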