Attractive Periodic Sets in Discrete-Time Recurrent Networks (with Emphasis on Fixed-Point Stability and Bifurcations in Two-Neuron Networks)

We perform a detailed fixed-point analysis of two-unit recurrent neural networks with sigmoid-shaped transfer functions. Using geometrical arguments in the space of transfer function derivatives, we partition the network state space into distinct regions corresponding to the stability types of the fixed points. Unlike previous studies, we do not assume any special form of connectivity pattern between the neurons, and all free parameters are allowed to vary. We also prove that when both neurons have excitatory self-connections and the mutual interaction pattern is the same (i.e., the neurons either mutually excite or mutually inhibit each other), new attractive fixed points are created through saddle-node bifurcations. Finally, for an N-neuron recurrent network, we give lower bounds on the rate of convergence of attractive periodic points toward the saturation values of neuron activations as the absolute values of the connection weights grow.
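
To make the setting concrete, the following Python sketch (not the authors' code; the weight matrix W, bias vector b, and initial states are illustrative assumptions) simulates a two-neuron discrete-time sigmoid network x(t+1) = g(W x(t) + b), follows iterates toward a fixed point, and classifies its stability by checking whether all Jacobian eigenvalues lie inside the unit circle.

```python
import numpy as np

# Minimal sketch, assuming a standard two-neuron discrete-time sigmoid map:
#   x(t+1) = g(W x(t) + b),  g(u) = 1 / (1 + exp(-u))
# W, b, and the initial states below are illustrative, not taken from the paper.

def g(u):
    return 1.0 / (1.0 + np.exp(-u))

def step(x, W, b):
    return g(W @ x + b)

def jacobian(x, W, b):
    # Jacobian of the map at x: diag(g'(net)) @ W, with g'(u) = g(u) (1 - g(u))
    net = W @ x + b
    gp = g(net) * (1.0 - g(net))
    return np.diag(gp) @ W

# Example: excitatory self-connections with mutually excitatory coupling
W = np.array([[6.0, 1.5],
              [1.5, 6.0]])
b = np.array([-3.0, -3.0])

for x0 in ([0.1, 0.1], [0.5, 0.5], [0.9, 0.9]):
    x = np.array(x0)
    for _ in range(1000):          # forward iteration settles onto an attractor
        x = step(x, W, b)
    eigvals = np.linalg.eigvals(jacobian(x, W, b))
    attractive = np.all(np.abs(eigvals) < 1.0)
    print(f"start {x0} -> fixed point {np.round(x, 4)}, "
          f"|eigenvalues| = {np.round(np.abs(eigvals), 3)}, "
          f"{'attractive' if attractive else 'not attractive'}")
```

Note that forward iteration only locates attractive fixed points; the saddle point that separates the two basins in a bistable configuration like this one would have to be found with a root finder applied to g(W x + b) - x.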
