Neural networks learning in a changing environment

The authors study the learning dynamics of a large class of neural networks with constant learning parameters. A learning algorithm that enables a neural network to adapt to a changing environment must have a non-vanishing learning parameter. This constant adaptability, however, comes at the cost of accuracy, i.e. the size of the fluctuations in the plastic variables, such as synaptic weights and thresholds. The introduction of Poisson-distributed time steps allows a continuous-time description of learning processes with non-vanishing learning parameters. The authors use this description to study the performance of neural networks operating in a changing environment. Given a well-defined error measure, an optimal learning parameter can be estimated in some cases.
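The trade-off described above can be illustrated with a minimal sketch (not the authors' model): a single weight tracks a steadily drifting target from noisy observations using a constant learning parameter. The drift rate, noise level, and update rule here are illustrative assumptions. A vanishing learning parameter cannot follow the drift, a large one amplifies noise-driven fluctuations, and an intermediate value minimizes the mean squared tracking error.

```python
import numpy as np

rng = np.random.default_rng(0)

def tracking_error(eta, steps=20000, drift=0.01, noise=0.5):
    """Mean squared error of one weight tracking a drifting target
    with a constant learning parameter eta (hypothetical toy model)."""
    w, target = 0.0, 0.0
    total = 0.0
    for _ in range(steps):
        target += drift                                   # environment changes steadily
        sample = target + noise * rng.standard_normal()   # noisy observation of the target
        w += eta * (sample - w)                           # constant-learning-rate update
        total += (w - target) ** 2
    return total / steps

# Small eta lags behind the drift; large eta fluctuates with the noise;
# an intermediate eta gives the smallest steady-state error.
errors = {eta: tracking_error(eta) for eta in (0.001, 0.01, 0.1, 0.5)}
```

In this toy setting the steady-state error is roughly the sum of a lag term, which grows as the learning parameter shrinks, and a fluctuation term, which grows as it increases, so the optimum lies at a finite, non-vanishing value, consistent with the abstract's claim.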