A Global Prediction (or Learning) Theory for Some Nonlinear Functional-Differential Equations

1. Introduction. This article surveys some recent global limit and oscillation theorems for systems of nonlinear difference-differential equations that define cross-correlated flows on probabilistic networks. These equations comprise the first stage of a theory of learning [1] that attempts to unify, at least qualitatively, some data from psychology, neurophysiology, and neuroanatomy by finding common mathematical principles that underlie these data. The behavior of these networks can be interpreted as a nonstationary and deterministic prediction theory, because the networks learn in a way that imitates the following heuristic example chosen from daily life. An experimenter E teaches a human subject S a list of events (say the list AB of letters) by presenting the letters one after the other to S, and then presenting the list several times in this way. To test whether S has learned the list, E then presents A alone to S and hopes that S will reply with B. If S does so whenever A is presented, E can safely assume that S has learned the list. We shall construct machines (the networks) which learn according to a similar procedure.

At least three phases exist in this learning paradigm: (i) the learning trial, during which the letters are presented; (ii) a remembering period, during which no new material is presented; and (iii) a recall trial, during which E tests S's memory by presenting A alone and noting how well S can reproduce B. We shall find that varying the geometry of our networks can dramatically change the qualitative properties of each of these phases. Moreover, the networks often exhibit a monotonic response to wildly oscillating inputs; some of them become easier to analyze when loops, and thus an extra source of nonlinear interactions, are added to them; some exhibit interactions that can be interpreted as "time reversals" on E's time scale; and some of their stability properties become easier to guarantee as the time lag increases.
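The three-phase paradigm can be made concrete with a toy sketch. The following is only an illustration of the learning/recall procedure using a simple cross-correlation (outer-product) associator; it is not the system of difference-differential equations studied in this article, and the one-hot encoding of the letters is a hypothetical choice made here for clarity.

```python
import numpy as np

# Hypothetical one-hot codes for the letters A and B.
A = np.array([1.0, 0.0])
B = np.array([0.0, 1.0])

# (i) Learning trials: presenting the list AB several times builds a
# cross-correlation (outer-product) weight matrix associating A with B.
W = np.zeros((2, 2))
for _ in range(3):          # several presentations of the list
    W += np.outer(B, A)     # correlate the response B with the cue A

# (ii) Remembering period: no new material is presented; W is unchanged.

# (iii) Recall trial: present A alone and read out the machine's reply.
response = W @ A
recalled = "B" if np.argmax(response) == np.argmax(B) else "A"
print(recalled)  # the machine replies with B
```

Here the weight matrix plays the role of the network's memory traces: repeated paired presentation strengthens the A-to-B correlation, so that the cue A alone later reproduces B.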