Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task

In this paper we probe the interaction between sequential and hierarchical learning by investigating implicit learning in a group of school-aged children. We administered a serial reaction time task, in the form of a modified Simon Task, in which the stimuli were organised according to the rules of two distinct artificial grammars, specifically Lindenmayer systems: the Fibonacci grammar (Fib) and the Skip grammar (a modification of the former). The choice of grammars follows from the goal of this study: to investigate how sensitivity to structure emerges over the course of exposure to an input whose surface transitional properties (by hypothesis) bootstrap structure. Studies conducted to date have mainly been designed to investigate low-level surface regularities, learnable in purely statistical terms, whereas hierarchical learning has not yet been effectively investigated. The possibility of directly pinpointing the interplay between sequential and hierarchical learning is instead at the core of our study: we presented children with two grammars, Fib and Skip, which share the same transitional regularities, thus providing identical opportunities for sequential learning, while crucially differing in their hierarchical structure. In particular, there are specific points in the sequence (k-points) which, despite giving rise to the same transitional regularities in the two grammars, support hierarchical reconstruction in Fib but not in Skip. In our protocol, children were simply asked to perform a traditional Simon Task and were completely unaware of the real purpose of the experiment. Results indicate that sequential learning occurred in both grammars, as shown by the decrease in reaction times over the course of the task, while differences were found in the sensitivity to k-points: these, we contend, play a role in hierarchical reconstruction in Fib, whereas they are devoid of structural significance in Skip. Specifically, we found that children were faster at k-points in sequences produced by Fib, providing an entirely new kind of evidence for the hypothesis that implicit learning involves an early activation of strategies of hierarchical reconstruction, grounded in a straightforward interplay with the statistically based computation of transitional regularities over sequences of symbols.
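To make the rewriting system concrete, the following is a minimal sketch of the Fibonacci grammar, assuming the standard L-system productions 0 → 1 and 1 → 01; the paper's exact alphabet, and the Skip grammar's productions, are not given in this abstract, so only Fib is implemented. The helper `transitional_probabilities` illustrates the kind of surface bigram statistics that, by hypothesis, bootstrap structure.

```python
# A minimal sketch of the Fibonacci grammar (Fib), assuming the standard
# L-system productions 0 -> 1, 1 -> 01; the paper's exact alphabet and
# the Skip grammar's productions are not specified in this abstract.
from collections import Counter

RULES = {"0": "1", "1": "01"}  # assumed Fib productions

def generate(axiom: str = "0", steps: int = 10) -> str:
    """Apply the productions to every symbol in parallel, `steps` times.

    Parallel rewriting of all symbols at once is the defining property
    of Lindenmayer systems, as opposed to the sequential rewriting of
    classical phrase-structure grammars.
    """
    s = axiom
    for _ in range(steps):
        s = "".join(RULES[c] for c in s)
    return s

def transitional_probabilities(s: str) -> dict:
    """Estimate the surface bigram statistics P(next | current)."""
    bigrams = Counter(zip(s, s[1:]))
    starts = Counter(s[:-1])
    return {f"{a}->{b}": n / starts[a] for (a, b), n in bigrams.items()}

if __name__ == "__main__":
    seq = generate(steps=12)
    print(len(seq))  # string lengths grow as Fibonacci numbers
    print(transitional_probabilities(seq))
    # With these rules "00" never occurs, so "0" is always followed by
    # "1", while "1" is followed by "0" or "1" in golden-ratio
    # proportions as the string grows.
```

Under these assumed rules, the deterministic and probabilistic transitions shown above are exactly the surface regularities that sequential learning can exploit, while the self-similar derivational history of the string is what hierarchical reconstruction would have to recover.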
