Towards Emergent Strong Systematicity in a Simple Dynamical Connectionist Network

One of the most striking features of human language is its systematicity. This paper focuses on a fundamental, higher-order form of systematicity described by word classes. We investigate whether such higher-order systematicity can be learned from symbol-based examples alone using a comparatively simple Jordan-type recurrent neural network (RNN). The network was allowed to keep a vector that represents a suitable starting state for the production of a particular dynamic; with this representation, it was able to reproduce particular sentences deterministically. We also show that the trained network acquires representations of word classes to a certain extent. Our analysis reveals that the higher-order grammatical constraints are realized by means of self-organized, sub-symbolic dynamics. We conclude that RNNs can learn higher-order systematicity in language. To increase the robustness of convergence, however, additional semantic, behaviorally grounded sources of information should be incorporated to bootstrap systematicity from semantics.
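The mechanism described above can be illustrated with a minimal sketch, under assumptions not taken from the paper: a Jordan-type network whose context units receive decayed feedback from the previous output, with a per-sentence start vector `c0` playing the role of the learned "suitable starting state". All sizes and weights here are hypothetical stand-ins for a trained network; the point is only that a fixed start vector deterministically selects one unrolled word sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 5-word one-hot vocabulary, 8 hidden units.
vocab, hidden = 5, 8

# Random weights stand in for a trained network.
W_ch = rng.standard_normal((hidden, vocab)) * 0.7   # context -> hidden
W_hy = rng.standard_normal((vocab, hidden)) * 0.7   # hidden  -> output
alpha = 0.5                                         # context decay rate

def generate(c0, steps):
    """Deterministically unroll a word sequence from start vector c0."""
    c = np.asarray(c0, dtype=float)
    sentence = []
    for _ in range(steps):
        h = np.tanh(W_ch @ c)                        # hidden activation
        word = int(np.argmax(W_hy @ h))              # winner-take-all word choice
        y = np.eye(vocab)[word]                      # one-hot previous output
        sentence.append(word)
        c = alpha * c + y                            # Jordan-style output feedback
    return sentence
```

Because the dynamics contain no stochastic element, the same start vector always regenerates the same sentence, while different start vectors can launch different trajectories; this is the sense in which the start vector indexes a particular sentence dynamic.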
