This paper presents simple conditions ensuring that dynamical neural networks are incrementally stable, that is, Lipschitz continuous, on L_p. A first interest of this result is that it directly ensures the continuity of the system as an operator from one signal space to another. In this context, this property may be interpreted as the ability of dynamical neural networks to interpolate; in some sense, it extends a well-known property of static neural networks. A second interest of this result is that the behavior of Lipschitz continuous systems with respect to specific inputs or initial-condition problems can be completely analyzed. Indeed, Lipschitz continuous systems have the steady-state property with respect to any input belonging to L_p with p in [1, infinity], i.e., their asymptotic behavior is uniquely determined by the asymptotic behavior of the input. Moreover, Lipschitz continuity guarantees the existence of globally asymptotically stable (in the sense of Lyapunov) equilibrium points for all constant inputs.
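For readers unfamiliar with the notion, a commonly used formulation of incremental L_p stability is sketched below; the operator symbol \Sigma and the gain \gamma are illustrative notation, not taken from the paper itself.

% Incremental L_p stability (standard formulation; \Sigma denotes the
% input-output operator of the system and \gamma its Lipschitz gain,
% both introduced here for illustration only).
\[
  \|\Sigma(u_1) - \Sigma(u_2)\|_{L_p} \;\le\; \gamma \, \|u_1 - u_2\|_{L_p}
  \qquad \text{for all } u_1, u_2 \in L_p, \; p \in [1,\infty].
\]

In words, the system, viewed as an operator between signal spaces, is globally Lipschitz with gain \gamma, which is the continuity property invoked in the abstract.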