Exponential convergence of a gradient descent algorithm for a class of recurrent neural networks

This paper considers the convergence of an approximate gradient descent backpropagation algorithm for a one-hidden-layer recurrent neural network whose output is an affine combination of certain nonlinear functions of the outputs of biased infinite impulse response (IIR) affine systems. We give a persistent excitation condition that guarantees local exponential convergence of the algorithm. We show that this condition holds for generic parameter values whenever generic periodic inputs of period at least N are applied, where N is the number of parameters.
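To make the setting concrete, the following minimal sketch illustrates one plausible instance of the architecture and update rule described above; it is not the paper's algorithm. It assumes first-order IIR units, tanh nonlinearities, and a pseudo-gradient that ignores the recursive dependence of the state on the parameters (one common reading of "approximate gradient descent"). All identifiers (a, b, c, w, eta, H) and these modeling choices are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 4                            # number of hidden IIR units (assumed)
a = rng.uniform(-0.5, 0.5, H)    # feedback coefficients (|a| < 1 for initial stability)
b = rng.normal(size=H)           # input coefficients
c = rng.normal(size=H)           # biases of the affine IIR systems
w = rng.normal(size=H + 1)       # affine output combination [w0, w1..wH]
eta = 1e-2                       # step size

def run_epoch(u, d):
    """One pass of pseudo-gradient descent over input u and target d.

    The state x is treated as parameter-independent when forming the
    gradients, so only the instantaneous dependence is backpropagated.
    """
    global a, b, c, w
    x_prev = np.zeros(H)
    for t in range(len(u)):
        x = a * x_prev + b * u[t] + c        # biased IIR affine systems
        s = np.tanh(x)                       # hidden nonlinearities
        y = w[0] + w[1:] @ s                 # affine combination at the output
        e = y - d[t]                         # output error
        g = e * w[1:] * (1.0 - s ** 2)       # sensitivity through the tanh
        w = w - eta * e * np.concatenate(([1.0], s))
        a = a - eta * g * x_prev             # d x / d a  ~ x_prev (recursion ignored)
        b = b - eta * g * u[t]
        c = c - eta * g
        x_prev = x

# Periodic input of period N, with N equal to the parameter count
# (a, b, c contribute H each; w contributes H + 1), matching the
# abstract's persistent excitation condition in spirit.
N = 4 * H + 1
u = np.tile(rng.normal(size=N), 60)
d = np.sin(2 * np.pi * np.arange(u.size) / N)  # hypothetical periodic target
for _ in range(25):
    run_epoch(u, d)
```

Under the paper's result, one would expect the parameter error of such an update to decay exponentially near a minimizer when the periodic input is sufficiently rich; the sketch above makes no attempt to verify the excitation condition itself.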