Noise robustness in multilayer neural networks

The training of multilayer neural networks in the presence of different types of noise is studied. We consider the learning of realizable rules in nonoverlapping architectures. Achieving optimal generalization requires knowledge of the noise level; misestimating it may lead to partial or complete loss of the generalization ability. We demonstrate this effect in the framework of online learning and present the results in terms of noise-robustness phase diagrams. While for additive (weight) noise the robustness properties depend on the architecture and size of the network, this is not so for multiplicative (output) noise. In this case we find a universal behaviour, independent of the machine size, for both the tree parity and committee machines.
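To make the setting concrete, the following is a minimal toy sketch (not the paper's algorithm) of online learning in a nonoverlapping (tree) committee machine from a teacher corrupted by multiplicative output noise, which flips the teacher's label with some probability. All sizes, the learning rate, and the perceptron-style update rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy parameters, not taken from the paper:
K, N = 3, 50      # K hidden units, each with its own N-dimensional branch
p_flip = 0.1      # multiplicative (output) noise: label-flip probability
eta = 0.5         # learning rate

teacher = rng.standard_normal((K, N))
student = rng.standard_normal((K, N))

def committee_out(W, x):
    # Tree architecture: hidden unit k sees only its own input branch x[k].
    return np.sign(np.sum(np.sign(np.einsum('kn,kn->k', W, x))))

for step in range(20000):
    x = rng.standard_normal((K, N))
    sigma = committee_out(teacher, x)
    if rng.random() < p_flip:
        sigma = -sigma                 # output noise flips the label
    if committee_out(student, x) != sigma:
        # Perceptron-style update on each branch disagreeing with the
        # (possibly noisy) label -- a stand-in for the paper's rule.
        for k in range(K):
            if np.sign(student[k] @ x[k]) != sigma:
                student[k] += eta * sigma * x[k] / np.sqrt(N)

# Estimate the generalization error against the noise-free teacher.
test = rng.standard_normal((5000, K, N))
err = np.mean([committee_out(student, xi) != committee_out(teacher, xi)
               for xi in test])
```

A random student would score `err` near 0.5; after training, `err` drops well below that, though its asymptotic value depends on how the noise level is accounted for in the update, which is the effect the paper quantifies.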