Teaching Recurrent Neural Networks to Modify Chaotic Memories by Example

The ability to store and manipulate information is a hallmark of computational systems. Whereas computers are carefully engineered to represent and perform mathematical operations on structured data, neurobiological systems perform analogous functions despite flexible organization and unstructured sensory input. Recent efforts have made progress in modeling the representation and recall of information in neural systems. However, precisely how neural systems learn to modify these representations remains far from understood. Here we demonstrate that a recurrent neural network (RNN) can learn to modify its representation of complex information using only examples, and we explain the associated learning mechanism with new theory. Specifically, we drive an RNN with examples of translated, linearly transformed, or pre-bifurcated time series from a chaotic Lorenz system, alongside an additional control signal that changes value for each example. When trained to replicate the Lorenz inputs, the network learns to evolve autonomously about a Lorenz-shaped manifold. It also learns to continuously interpolate and extrapolate the translation, transformation, and bifurcation of this representation, far beyond the training data, as the control signal is varied. Finally, we provide a mechanism for how these computations are learned, and demonstrate that a single network can simultaneously learn multiple computations. Together, our results provide a simple but powerful mechanism by which an RNN can learn to manipulate internal representations of complex information, allowing for the principled study and precise design of RNNs.
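
The abstract does not spell out the training procedure, but the setup it describes can be illustrated with a minimal reservoir-computing sketch: a fixed random recurrent network is driven by a Lorenz trajectory plus a control input, and a linear readout is ridge-regressed to reproduce the input one step ahead; the output is then fed back so the network evolves autonomously. Everything below (network size, spectral radius, regularization, control value, single training example) is an illustrative assumption, not the authors' actual method or parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz system: the chaotic time series used as the training example.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

dt, T = 0.01, 5000
sol = solve_ivp(lorenz, [0, T * dt], [1.0, 1.0, 1.0],
                t_eval=np.arange(T) * dt)
u = sol.y.T / 20.0            # (T, 3) scaled Lorenz input
c = np.full((T, 1), 0.5)      # control signal (one fixed value here)

# Fixed random reservoir; only the readout is trained. All sizes and
# scalings below are illustrative, not the paper's actual parameters.
rng = np.random.default_rng(0)
N = 500
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
W_in = rng.uniform(-1, 1, (N, 3))
W_c = rng.uniform(-1, 1, (N, 1))

# Drive the reservoir with the example and record its states.
x, X = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t] + W_c @ c[t])
    X[t] = x

# Ridge-regress a readout that predicts the input one step ahead.
washout = 200
A, b = X[washout:-1], u[washout + 1:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b).T

# Closed loop: feed the prediction back as input. The state now evolves
# autonomously about a Lorenz-shaped manifold, steered by the control.
y = u[-1]
for _ in range(1000):
    x = np.tanh(W @ x + W_in @ y + W_c @ np.array([0.5]))
    y = W_out @ x
```

In the full scheme the abstract describes, several such trajectories, each translated, transformed, or bifurcated and tagged with a distinct control value, would together form the training data; sweeping the control input in closed loop then interpolates and extrapolates the learned manipulation.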
