Stochastic convergence analysis of the single-layer backpropagation algorithm for noisy input data

The statistical learning behavior of the single-layer backpropagation algorithm was previously analyzed for a system identification formulation with noise-free training data; transient and steady-state results were obtained for the mean weight behavior, the mean-square error (MSE), and the probability of correct classification. This article extends those results to the case of noisy training data. Three new analytical results are obtained: (1) the mean weights converge to finite values, (2) the MSE is bounded away from zero, and (3) the probability of correct classification does not converge to unity. However, over a wide range of signal-to-noise ratios (SNRs), the noisy training data does not significantly shift the perceptron stationary points relative to the weight fluctuations. Hence, one concludes that noisy training data has a relatively small effect on the ability of the perceptron to learn the underlying weight vector F of the training-signal model.
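To make the setting concrete, the following is a minimal Monte Carlo sketch of the kind of experiment the analysis describes: a single-layer perceptron trained by backpropagation on a system identification model in which labels come from a fixed reference weight vector F applied to clean Gaussian inputs, while the network trains on noise-corrupted copies. The Gaussian input model, the tanh nonlinearity, and all parameter values (dimension, step size, SNR, run counts) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                       # input dimension (assumed)
F = rng.standard_normal(N)
F /= np.linalg.norm(F)      # underlying weight vector of the training-signal model
mu = 0.05                   # step size (assumed)
snr_db = 10.0               # training-data SNR (assumed)
noise_std = 10 ** (-snr_db / 20)

runs, iters = 200, 5000
W = np.tile(0.01 * rng.standard_normal(N), (runs, 1))  # common small initialization
mse = np.zeros(iters)

for k in range(iters):
    x = rng.standard_normal((runs, N))                    # clean Gaussian inputs
    d = np.sign(x @ F)                                    # desired response generated by F
    xn = x + noise_std * rng.standard_normal((runs, N))   # noisy training inputs
    y = np.tanh(np.einsum('ij,ij->i', W, xn))             # single-layer sigmoid output
    e = d - y
    mse[k] = np.mean(e ** 2)
    # backpropagation (stochastic gradient) update through the tanh nonlinearity
    W += mu * (e * (1 - y ** 2))[:, None] * xn

w_bar = W.mean(axis=0)  # ensemble-mean weights across independent runs
print("alignment of mean weights with F:", w_bar @ F / np.linalg.norm(w_bar))
print("steady-state MSE estimate:", mse[-500:].mean())
```

Under this sketch, the ensemble-mean weights settle to finite values aligned with F, while the steady-state MSE estimate remains bounded away from zero, consistent with the three analytical results stated above.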