New Frontiers in Computational Models of Grammatical Development

Micah B. Goldwater 1 (micahbg@gmail.com), Scott Friedman 2, Dedre Gentner 1, Ken Forbus 2
Department of Psychology 1; Department of Electrical Engineering & Computer Science 2, Northwestern University, Evanston, IL 60208 USA

Cynthia L. Fisher 1 (clfishe@cyrus.psych.illinois.edu), Michael Connor 2, Dan Roth 2
Department of Psychology 1; Department of Computer Science 2, University of Illinois at Urbana-Champaign, Champaign, IL 61820 USA

Franklin Chang (Franklin.Chang@liverpool.ac.uk)
School of Psychology, University of Liverpool, Liverpool, L69 7ZA UK

Gary S. Dell (gdell@cyrus.psych.illinois.edu)
Department of Psychology, University of Illinois at Urbana-Champaign, Champaign, IL 61820 USA

Keywords: computational models; language development; syntax; thematic roles

Introduction

How children acquire the grammar of their native language has been a central topic in cognitive science since the field's outset, and has been the focus of much debate. One view assumed an innate Universal Grammar that genetically endowed the child with highly structured knowledge of language (Chomsky, 1965). An opposing position argued against both the assumption of innate knowledge and that of structured representations, instead using connectionist architectures with distributed representations to learn grammatical patterns (e.g., Rumelhart & McClelland, 1986).

The field has progressed. Many years of rigorous empirical work have detailed the developmental pattern in children. In parallel, AI and cognitive science have made many advances in sophisticated learning algorithms. This symposium brings together models at the forefront of such empirical and computational research. Each model has roots in both sides of the early debate, positing (at least some) structured representations and specifying learning mechanisms. However, the models differ in crucial ways: they use different computational architectures and learning algorithms, and they differ in the knowledge built into the system. These differences reflect and build on different current theories of grammatical development, and the models focus on simulating empirical phenomena critical for distinguishing those theories. This symposium presents a unique opportunity to compare these new approaches and to invite an open discussion.

Symposium Structure

This symposium will present three computational models of grammatical development. The first talk, by Cindy Fisher, presents a model rooted in early abstraction theories of language development (e.g., Fisher, 2002). The model is implemented in a machine learning architecture that uses its innate biases linking syntax and semantics, together with learned grammatical categories, to semantically parse sentences and guide word learning. The second talk, by Franklin Chang, will describe a connectionist model that does not require explicit thematic roles or innate linking rules to learn syntax, but instead can acquire language from visual-spatial input. The third talk, by Micah Goldwater, presents a third approach: a usage-based model that uses structured symbolic representations and learns abstract thematic roles via analogical abstraction.

The symposium begins with a brief introduction, followed by the three presentations of computational models, and concludes with a discussion exploring the issues. Dedre Gentner will introduce the symposium. She is the Alice Gabrielle Twight Professor of Psychology and Education at Northwestern University. Gary S. Dell will serve as the discussant. He is Professor of Psychology and Linguistics at the University of Illinois at Urbana-Champaign. We now summarize each talk in turn.

The Origin of Syntactic Bootstrapping: A Computational Model

Syntactic bootstrapping proposes that children use knowledge of sentence structure in sentence interpretation and verb learning. We present a computational model of the origins of syntactic bootstrapping, based on systems for automatic semantic-role labeling (SRL). SRL models learn to identify sentence constituents that fill semantic roles, and to determine their roles, such as agent, patient, or goal.

The present 'BabySRL' instantiates the structure-mapping account of syntactic bootstrapping (Fisher et al., 2010). We assume a structure-mapping process between the nouns in a sentence and the core semantic arguments of the verb, in which children are biased to create one-to-one mappings. Given this one-to-one mapping bias, the number of nouns in the sentence becomes intrinsically meaningful to toddlers. Second, this account proposes that children's representations of sentences, though partially specified, are couched in
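The one-to-one mapping bias can be sketched in a few lines of code. The following is a toy illustration only, not the authors' BabySRL implementation: a "listener" that uses nothing but the number and order of nouns to guess candidate semantic roles, so that a two-noun sentence implies a two-participant event even when the verb is unknown. The function name and the fixed role order are assumptions introduced here for illustration; a real SRL learner would also have to identify the nouns and learn the role labels.

```python
def assign_roles(nouns):
    """Guess a semantic role for each noun purely from its position.

    Illustrates the one-to-one mapping bias: each noun is mapped to
    exactly one core argument slot of the (possibly unknown) verb.
    """
    if len(nouns) == 1:
        # One noun: treat it as the sole participant of the event.
        return {nouns[0]: "agent"}
    # Positional heuristic: first noun -> agent, second -> patient,
    # third -> goal. zip() stops at the shorter list, so extra role
    # labels are simply unused.
    roles = ["agent", "patient", "goal"]
    return dict(zip(nouns, roles))

# "Sarah gorped Max": two nouns suggest a two-participant event,
# even though "gorp" is a novel verb.
print(assign_roles(["Sarah", "Max"]))
```

Note what the sketch leaves out: it has no syntax beyond linear order, so it cannot handle passives or datives; the talk's point is precisely how a learner moves from such shallow, partially specified representations toward more abstract ones.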
References

[1] Susan Foster-Cohen, et al. (2004). Constructing a Language: A Usage-Based Theory of Language Acquisition. Studies in Second Language Acquisition.
[2] Cynthia Fisher, et al. (2002). The role of abstract syntactic knowledge in language acquisition: A reply to Tomasello (2000). Cognition.
[3] James L. McClelland, et al. (1986). On learning the past tenses of English verbs: Implicit rules or parallel distributed processing.
[4] Sylvia Yuan, et al. (2010). Syntactic bootstrapping. Wiley Interdisciplinary Reviews: Cognitive Science.
[5] G. Dell, et al. (2006). Becoming syntactic. Psychological Review.
[6] Brian Falkenhainer, et al. (1989). The Structure-Mapping Engine: Algorithm and examples. Artificial Intelligence.
[7] Kenneth D. Forbus, et al. (2000). Modeling infant learning via symbolic structural alignment.
[8] Julian M. Pine, et al. (2004). Constructing a Language: A Usage-Based Theory of Language Acquisition.
[9] G. Altmann, et al. (2003). The time-course of prediction in incremental sentence processing: Evidence from anticipatory eye-movements.
[10] F. Chang (2002). Symbolically speaking: A connectionist model of sentence production.
[11] Noam Chomsky, et al. (1965). Aspects of the Theory of Syntax.
[12] Franklin Chang, et al. (2009). Learning to order words: A connectionist model of heavy NP shift and accessibility effects in Japanese and English.