On the Effect of Neuron Activation Gain on Robustness of Complete Stability

The paper addresses the robustness of complete stability with respect to perturbations of the interconnections of nominal symmetric neural networks. The influence of the maximum neuron activation gain on complete stability robustness is discussed for a class of third-order neural networks. It is shown that high values of the gain lead to an extremely small complete stability margin for all nominal symmetric neural networks, leading to the conclusion that complete stability robustness cannot, in general, be guaranteed.
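To make the setting concrete, the following is a minimal sketch of a symmetric third-order network of the standard additive (Hopfield-type) form x' = -x + W·g(x) with sigmoidal activation g(x) = tanh(βx), where β plays the role of the neuron activation gain. The interconnection matrix W and all parameter values here are illustrative assumptions, not taken from the paper; with a symmetric W, trajectories of such a network converge to an equilibrium (complete stability), which a simple Euler integration can exhibit numerically.

```python
import math

def simulate(W, beta, x0, dt=0.01, steps=20000):
    """Euler-integrate x' = -x + W.g(x), g(x) = tanh(beta*x),
    for a small additive neural network. Returns the final state
    and the last computed derivative (a convergence indicator)."""
    x = list(x0)
    n = len(x)
    dx = [0.0] * n
    for _ in range(steps):
        act = [math.tanh(beta * xi) for xi in x]          # neuron outputs
        dx = [-x[i] + sum(W[i][j] * act[j] for j in range(n))
              for i in range(n)]                          # additive model dynamics
        x = [x[i] + dt * dx[i] for i in range(n)]
    return x, dx

# Illustrative symmetric interconnection matrix (third-order network).
W = [[0.0, 0.5, -0.3],
     [0.5, 0.0, 0.2],
     [-0.3, 0.2, 0.0]]

# High activation gain beta; the trajectory settles onto an equilibrium.
x_final, dx_final = simulate(W, beta=4.0, x0=[0.1, -0.2, 0.05])
residual = max(abs(d) for d in dx_final)
```

Note that with gain β = 4 the origin is unstable (the Jacobian -I + βW has a positive eigenvalue for this W), so the network is multistable and the trajectory converges to a nonzero equilibrium. A small *asymmetric* perturbation of W is exactly the kind of interconnection error whose effect on complete stability the paper studies.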