Adaptive time-delay neural network for temporal correlation and prediction

Dynamic identification of temporally changing signals is a key problem in real-time signal processing and understanding. Such signals may arise from moving objects in visual images, spoken words, target trajectories, and other kinds of sensor data in a wide variety of applications. An Adaptive Time-Delay Neural Network (ATNN) is proposed that dynamically adapts its time delays as well as its synaptic weights. The resulting network is trained to distinguish the temporal properties and spatiotemporal correlations of various input patterns. In biological systems, the delays along axons or at synapses may vary, as they do in the ATNN, owing to factors such as axon length, insulation (myelin), and the details of the biochemical processes involved. In this paper, an improved learning algorithm based on gradient descent is derived for both the adaptive time delays and the synaptic strengths. This adaptation paradigm gives the network the flexibility to attain optimal time delays and to achieve more accurate pattern mapping and classification than is possible with arbitrary fixed delays, as in previous approaches. Noise tolerance was evaluated in a series of experiments, in which the proposed ATNN showed advantages. Time-series prediction was tested on the chaotic Mackey-Glass equation, and the ATNN outperformed an otherwise identical network trained with fixed time delays. The ATNN is well suited to spatiotemporal signal recognition, prediction, and classification.
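
To make the adaptation paradigm concrete, the sketch below gives one possible reading of the core idea in Python (it is not the authors' code): a single linear time-delay neuron whose tap weights and real-valued tap delays are both updated by gradient descent, with delayed inputs obtained by linear interpolation so that the gradient with respect to the delay is well defined. The one-step-ahead Mackey-Glass prediction task, the number of taps, the learning rates, and all other constants are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of adapting both weights and time delays by gradient descent.
# All constants (a, b, tau, K, learning rates) are illustrative assumptions.
import numpy as np

def mackey_glass(n, tau=17, a=0.2, b=0.1, dt=1.0, x0=1.2):
    """Generate a Mackey-Glass series by simple Euler integration."""
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (a * x[t - tau] / (1.0 + x[t - tau] ** 10) - b * x[t])
    return x[tau:]

def delayed(x, t, d):
    """x(t - d) for a real-valued delay d, via linear interpolation."""
    s = t - d
    i = int(np.floor(s))
    frac = s - i
    return (1 - frac) * x[i] + frac * x[i + 1]

def delayed_slope(x, t, d):
    """Approximate time derivative of x at t - d (needed for the delay gradient)."""
    i = int(np.floor(t - d))
    return x[i + 1] - x[i]                # sampling step dt = 1

rng = np.random.default_rng(0)
x = mackey_glass(1500)
K = 8                                     # number of adaptive taps
w = rng.normal(scale=0.1, size=K)         # synaptic weights
d = rng.uniform(1.0, 20.0, size=K)        # initial time delays (adapted below)
eta_w, eta_d = 1e-3, 1e-2

for epoch in range(10):
    sse = 0.0
    for t in range(25, len(x) - 1):
        xd = np.array([delayed(x, t, dk) for dk in d])
        y = w @ xd                        # neuron output: sum_k w_k * x(t - d_k)
        e = x[t + 1] - y                  # predict one step ahead
        sse += e * e
        # Weight update: dE/dw_k = -e * x(t - d_k)
        w += eta_w * e * xd
        # Delay update: dE/dd_k = e * w_k * x'(t - d_k)
        slopes = np.array([delayed_slope(x, t, dk) for dk in d])
        d -= eta_d * e * w * slopes
        d = np.clip(d, 1.0, 24.0)         # keep delays inside the stored history
    print(f"epoch {epoch:2d}  SSE = {sse:.3f}")
```

The delay update follows from the chain rule, since the output depends on each delay only through the interpolated sample x(t - d_k); with integer-only delays this gradient would not exist, which is why the sketch interpolates between adjacent samples.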