Parsing to Learn

Learning a language by parameter setting is almost certainly less onerous than composing a grammar from scratch. But recent computational modeling of how parameters are set has shown that it is not at all the simple mechanical process sometimes imagined. Sentences must be parsed to discover the properties that select between parameter values. But the sentences that drive learning cannot be parsed with the learner's current grammar. And there is not much point in parsing them with just one new grammar. They must apparently be parsed with all possible grammars, in order to find out which one is most successful at licensing the language. The research task is to reconcile this with the fact that the human sentence parsing mechanism, even in adults, has only very limited parallel parsing capacity. I have proposed that all possible grammars can be folded into one, if parameter values are fragments of sentential tree structures that the parser can make use of where necessary to assign a structure to an input sentence. However, the problem of capacity limitations remains. The combined grammar will afford multiple analyses for some sentences, too many to be computed on-line. I propose that the parser computes only one analysis per sentence but can detect ambiguity, and that the learner makes use of unambiguous input only. This provides secure information but relatively little of it, particularly at early stages of learning, where few grammars have been excluded and ambiguity is rife. I consider three solutions: improving the parser's ability to extract unambiguous information from partially ambiguous sentences; assuming default parameter values to eliminate ambiguity temporarily; and reconfiguring the parameters so that some are subordinate to others and do not present themselves to the learner until the others have been set.
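The unambiguous-input strategy can be sketched in miniature. Everything below is illustrative rather than a model of the actual proposal: the two binary parameters, the representation of a sentence as a list of alternative analyses (partial parameter-value assignments), and the `licenses` check that stands in for genuine parsing.

```python
from itertools import product

# Hypothetical toy parameter space: two binary parameters.
PARAMS = ["head_initial", "null_subject"]

def licenses(grammar, sentence):
    """A sentence is a list of alternative analyses, each a partial
    parameter-value assignment; the grammar licenses the sentence if at
    least one analysis is consistent with it. Real parsing would build
    tree structure instead of checking feature consistency."""
    return any(all(grammar[p] == v for p, v in analysis.items())
               for analysis in sentence)

def learn_from_unambiguous(corpus):
    """Set a parameter only when every grammar that licenses the current
    sentence agrees on its value -- i.e. the sentence is an unambiguous
    trigger for that parameter. Ambiguous sentences are simply skipped."""
    settings = {}
    for sentence in corpus:
        # Grammars still compatible with the values set so far.
        live = [g for g in
                (dict(zip(PARAMS, vals))
                 for vals in product([True, False], repeat=len(PARAMS)))
                if all(g[p] == v for p, v in settings.items())]
        successful = [g for g in live if licenses(g, sentence)]
        for p in PARAMS:
            agreed = {g[p] for g in successful}
            if len(agreed) == 1 and p not in settings:
                settings[p] = agreed.pop()
    return settings

corpus = [
    [{"head_initial": True}, {"null_subject": True}],  # ambiguous: ignored
    [{"head_initial": True}],                          # unambiguous trigger
]
print(learn_from_unambiguous(corpus))  # {'head_initial': True}
```

As the example shows, the ambiguous first sentence contributes nothing, which is exactly the "secure but scarce information" problem the abstract raises: early in learning, most input is like that first sentence.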
A more radical alternative is to give up the quest for error-free learning and permit parameters to be set without regard for whether the parser may have overlooked an alternative analysis of the sentence. If it can be assumed that the human parser keeps a running tally of the parameter values it has accessed, then the learner would do nothing other than parse sentences for comprehension, as adults do. The most useful parameter values would become more and more easily accessed; the noncontributors would drop out of the running. There would be no learning mechanism at all, over and above the parser. But how accurate this system would be remains to be established.
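The radical alternative can also be sketched as toy code. Again all names and the representation of a sentence as a list of alternative partial parameter-value analyses are assumptions for illustration: the point is only that comprehension and learning collapse into one routine, with no separate learning mechanism.

```python
from collections import defaultdict

def comprehend(sentence, tally):
    """Parse for comprehension: prefer the analysis whose parameter
    values have the highest summed tally (i.e. are currently easiest
    for the parser to access), then reward the values actually used."""
    best = max(sentence, key=lambda analysis:
               sum(tally[(p, v)] for p, v in analysis.items()))
    for p, v in best.items():
        tally[(p, v)] += 1.0  # values used become more accessible
    return best

tally = defaultdict(lambda: 1.0)  # all values start equally accessible
corpus = [
    [{"head_initial": True}],                          # unambiguous
    [{"head_initial": True}, {"null_subject": True}],  # ambiguous
    [{"head_initial": True}],
]
for sentence in corpus:
    comprehend(sentence, tally)
print(tally[("head_initial", True)])  # 4.0
```

After this tiny corpus, `head_initial=True` has been reinforced three times while its competitors sit at their starting value; noncontributing values would eventually drop out of the running. Note that on the ambiguous second sentence the parser simply took the more accessible analysis without checking for alternatives, which is where the accuracy question raised above arises.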
