SATURATED OUTPUTS FOR HIGH-GAIN, SELF-EXCITING NETWORKS
We consider a broad class of continuous-time dynamical systems modeling a collection of processing units sending signals to each other. Each unit has an internal state variable x_i and an output variable y_i, which is a nondecreasing function g_i(x_i). Certain outputs, called "forced", have the form σ_j(K x_j), where σ_j is a sigmoid and K > 0 is a parameter called the "gain". The dynamics are given by a system of differential equations of the form dx/dt = H(x, y, t). The system is self-exciting: ∂H_i/∂y_i ≥ 0, with strict inequality for the forced outputs. We show that for sufficiently high gain, along any stable solution x(t) defined on a finite interval J, the forced outputs are close to the asymptotic limiting values of their sigmoids for a proportion of t ∈ J that approaches 1 as K → ∞. This generalizes Hopfield's Saturation Theorem for additive neural networks with symmetric weight matrices.
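The saturation phenomenon can be illustrated numerically. The sketch below is a minimal assumed example, not the paper's construction: it takes the additive Hopfield-style special case dx/dt = -x + W·σ(Kx) + b with a symmetric weight matrix W whose diagonal is positive (so ∂H_i/∂y_i > 0, i.e. self-excitation), σ = tanh with limiting values ±1, and forward-Euler integration. It measures the proportion of the time interval during which every output σ(K·x_i) lies near a limiting value; as the abstract states, this proportion should approach 1 as the gain K grows.

```python
import numpy as np

def sigma(u):
    """Sigmoid output nonlinearity with asymptotic limiting values -1 and +1."""
    return np.tanh(u)

def saturated_fraction(K, T=50.0, dt=0.01, tol=0.05, seed=0):
    """Integrate dx/dt = -x + W @ sigma(K*x) + b by forward Euler on [0, T]
    and return the fraction of time steps at which every forced output
    sigma(K * x_i) is within `tol` of one of the limiting values +1 or -1.
    All parameter choices here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n = 4
    A = rng.standard_normal((n, n))
    W = (A + A.T) / 2.0                  # symmetric weights (Hopfield's setting)
    # Positive diagonal gives self-excitation: dH_i/dy_i > 0 for each output.
    np.fill_diagonal(W, np.abs(np.diag(W)) + 1.0)
    b = 0.1 * rng.standard_normal(n)     # constant external input
    x = rng.standard_normal(n)           # initial internal states
    steps = int(T / dt)
    near_limit = 0
    for _ in range(steps):
        y = sigma(K * x)
        x = x + dt * (-x + W @ y + b)    # Euler step for dx/dt = H(x, y)
        if np.all(1.0 - np.abs(y) < tol):
            near_limit += 1
    return near_limit / steps

if __name__ == "__main__":
    for K in (1.0, 10.0, 100.0):
        print(f"K = {K:6.1f}  saturated fraction = {saturated_fraction(K):.3f}")
```

At high gain the outputs snap to ±1 almost immediately after any transient, so the measured fraction climbs toward 1 with K, matching the qualitative statement of the theorem.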