Recursive Bayesian Modelling of Time Series by Neural Networks

The Bayesian interpretation of regularisation is now well established for batch processing of data by neural networks. However, when the data arrive sequentially, the most common approach is still to use least-squares-based algorithms. Previous work has suggested Kalman-filter-based algorithms for training neural networks under sequential learning with regularisation. We examine specifically the class of approximation schemes known as general linear models. In this case the Bayesian learning of the network weights with Gaussian approximations leads to a Kalman filter algorithm for the weights. The Kalman filter iteratively learns the probability density of the weights and incorporates online regularisation. We investigate the application of this technique to two time series problems, the first an illustrative demonstration problem, the second motivated by an analytical model of slender delta wings.
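
To make the recursion concrete, the following is a minimal sketch of a Kalman filter weight update for a model that is linear in its weights, y = phi(x)^T w + noise, with a Gaussian prior N(0, I/alpha) on the weights acting as the regulariser. The radial-basis-function features, the parameter names alpha (prior precision), r (observation-noise variance), and q (process-noise variance), and the toy data are illustrative assumptions, not the paper's specific model or test problems.

```python
import numpy as np

def kalman_glm_update(w, P, phi, y, r=0.1, q=0.0):
    """One recursive Bayesian (Kalman filter) update of the Gaussian
    weight posterior N(w, P) for a model y = phi(x)^T w + noise."""
    # Random-walk prediction step: the weight mean is unchanged,
    # the covariance is inflated by the process noise q * I.
    P = P + q * np.eye(len(w))
    # Innovation (one-step-ahead prediction error) and its variance.
    e = y - phi @ w
    s = phi @ P @ phi + r
    # Kalman gain, then posterior update of the mean and covariance.
    k = P @ phi / s
    w = w + k * e
    P = P - np.outer(k, P @ phi)
    return w, P

# Illustrative use: learn the weights of a fixed radial-basis-function
# expansion online, one observation at a time.
rng = np.random.default_rng(0)
centres = np.linspace(-1.0, 1.0, 10)

def phi(x, width=0.2):
    return np.exp(-0.5 * ((x - centres) / width) ** 2)

alpha = 1.0                       # prior precision; P0 = I/alpha encodes the regulariser
w = np.zeros(len(centres))        # prior mean of the weights
P = np.eye(len(centres)) / alpha  # prior covariance of the weights

for t in range(500):
    x = rng.uniform(-1.0, 1.0)
    y = np.sin(3.0 * x) + 0.1 * rng.standard_normal()
    w, P = kalman_glm_update(w, P, phi(x), y, r=0.01, q=1e-6)
```

In this sketch the prior covariance plays the role of the weight-decay regulariser, and the recursion maintains the full Gaussian posterior over the weights as data arrive, rather than a point estimate as in recursive least squares.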