Improving Neural Models of Language with Input-Output Tensor Contexts

Tensor contexts enhance the performance and computational power of many neural models of language by imposing a double filtering on incoming data. In the linguistic domain, their implementation enables highly efficient disambiguation of polysemous and homonymous words. For the neurocomputational modeling of language, the simultaneous tensor contextualization of inputs and outputs inserts into the models strategic passwords that route words toward key natural targets, thereby allowing the creation of meaningful phrases. In this work, we present the formal properties of these models and describe possible ways of using contexts to represent plausible neural organizations of word sequences. We include an illustration of how these contexts generate a topographic or thematic organization of data. Finally, we show that double contextualization opens promising avenues for exploring the neural coding of episodes, one of the most challenging problems in neural computation.
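The context-dependent disambiguation described above can be sketched with a minimal numpy example. This is not the paper's implementation, only an illustration under common assumptions for Kronecker-product associative memories: words, contexts, and meanings are random unit vectors, and the memory is a sum of outer products mapping each (word ⊗ context) pair to its meaning. The word "bank" and all vector names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(n):
    """Random unit vector, standing in for a distributed word/context code."""
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

d = 64  # dimensionality of the representations (arbitrary choice)

# A polysemous word and two contexts that should disambiguate it.
bank = unit(d)
ctx_finance, ctx_river = unit(d), unit(d)

# Two distinct output meanings.
meaning_money, meaning_shore = unit(d), unit(d)

# Context-dependent memory: each association stores
# output ⊗ (input ⊗ context)^T, summed over the training pairs.
M = (np.outer(meaning_money, np.kron(bank, ctx_finance))
     + np.outer(meaning_shore, np.kron(bank, ctx_river)))

# Recall: the same word, filtered through different contexts,
# is routed to different meanings.
out_fin = M @ np.kron(bank, ctx_finance)
out_riv = M @ np.kron(bank, ctx_river)

# The retrieved vector aligns with the context-appropriate meaning
# (overlap near 1) and is nearly orthogonal to the other one.
print(np.dot(out_fin, meaning_money), np.dot(out_fin, meaning_shore))
print(np.dot(out_riv, meaning_shore), np.dot(out_riv, meaning_money))
```

Because the Kronecker product makes (bank ⊗ ctx_finance) and (bank ⊗ ctx_river) nearly orthogonal even though both share the word vector, the context acts as the "double filter" selecting which stored association fires.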
