Neural Network Smoothing in a Correlated Time Series Context

In this paper we present a neural network (NN) smoothing architecture for non-parametric estimation of the trend of a time series observed at regularly spaced time intervals. The NN smoother computes the trend in the state domain by minimizing a cost function that includes a regularization term. This term is weighted by a penalty parameter lambda, which forces the learning procedure to smooth in the time domain. We define a selection criterion for choosing the best value of lambda, and we prove that this criterion is an unbiased approximation of the mean average squared error when the noise component of the time series is a zero-mean, autocorrelated, stationary process whose autocovariance coefficients vanish beyond a certain known order.
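To make the penalized learning procedure concrete, the sketch below illustrates one possible form of such an NN smoother: a small feed-forward network maps the time index to a trend estimate and is trained on a least-squares fidelity term plus a lambda-weighted roughness penalty (here, squared second differences of the fitted trend). The network size, the penalty form, and the function name fit_nn_smoother are illustrative assumptions, not the architecture of the paper; the selection criterion for lambda is not reproduced here.

```python
# Minimal sketch (assumed form): penalized NN trend smoothing in PyTorch.
# The roughness penalty on second differences is one way to force
# smoothing in the time domain; the paper's exact cost may differ.
import torch
import torch.nn as nn

def fit_nn_smoother(y, lam, epochs=2000, lr=1e-2):
    n = len(y)
    t = torch.linspace(0.0, 1.0, n).unsqueeze(1)           # rescaled time grid
    y = torch.as_tensor(y, dtype=torch.float32).unsqueeze(1)

    # Small feed-forward network: time index -> trend value (illustrative sizes).
    net = nn.Sequential(nn.Linear(1, 20), nn.Tanh(), nn.Linear(20, 1))
    opt = torch.optim.Adam(net.parameters(), lr=lr)

    for _ in range(epochs):
        opt.zero_grad()
        trend = net(t)
        fit_err = torch.mean((y - trend) ** 2)              # data-fidelity term
        d2 = trend[2:] - 2 * trend[1:-1] + trend[:-2]        # second differences
        rough = torch.mean(d2 ** 2)                          # time-domain roughness
        loss = fit_err + lam * rough                         # penalized cost
        loss.backward()
        opt.step()
    return net(t).detach().squeeze(1)
```

In this reading, the selection criterion of the paper would be evaluated over a grid of lambda values, each producing one fitted trend from a call like the one above, and the lambda minimizing the criterion would be retained.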