Simplicity as a driving force in linguistic evolution

How did language come to have its characteristic structure? Many argue that by understanding the parts of our biological machinery relevant to language, we can explain why language is the way it is. If the hallmarks of language are simply properties of this biological machinery, elicited through the process of language acquisition, then such an explanatory route is adequate. But as soon as we admit the possibility that knowledge of language is learned, in the sense that language acquisition is a process involving inductive generalisation, an explanatory gap opens: a thorough explanation of the characteristic structure of language must now also explain why the input to the acquisition process has certain properties and not others. This thesis builds on recent work proposing that the linguistic stimulus has structural properties that arise as a result of linguistic evolution. On this view, languages themselves adapt to fit the task of learning: they reflect an accumulated structural residue laid down by previous generations of language users. Using computational models of linguistic evolution, I explore the relationship between language induction and generalisation based on a simplicity principle, and the linguistic evolution of compositional structure.

The two main contributions of this thesis are as follows. First, using a model of induction based on the minimum description length (MDL) principle, I address the question of linguistic evolution resulting from a bias towards compression. Second, I carry out a thorough examination of the parameter space affecting the cultural transmission of language, and show that the conditions for linguistic evolution towards compositional structure correspond to (1) specific levels of semantic complexity, and (2) induction based on sparse exposure to the language.
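The compression bias at the heart of the first contribution can be illustrated with a minimal two-part MDL sketch. This is an assumption-laden toy, not the thesis's actual model: the lexicon sizes, signal lengths, and 8-bits-per-character coding scheme are all invented for illustration. It shows why, over a structured meaning space, a compositional lexicon yields a shorter total description than a holistic one.

```python
import math

# Toy meaning space: (agent, action) pairs drawn from a 3 x 3 grid.
# A holistic language stores 9 unrelated whole signals; a compositional
# language stores 3 agent morphemes plus 3 action morphemes.

def model_cost(signals):
    """Bits to write down the lexicon: a crude 8 bits per character."""
    return sum(8 * len(s) for s in signals)

def holistic_dl(n_utterances):
    # 9 whole-sentence signals, 6 characters each (hypothetical sizes).
    model = model_cost(["x" * 6] * 9)
    data = n_utterances * math.log2(9)      # one lexicon pointer per utterance
    return model + data

def compositional_dl(n_utterances):
    # 3 agent + 3 action morphemes, 3 characters each (hypothetical sizes).
    model = model_cost(["x" * 3] * 6)
    data = n_utterances * 2 * math.log2(3)  # two pointers per utterance
    return model + data

# The data costs coincide (log2(9) == 2 * log2(3)), so the compositional
# grammar wins purely on model cost; an MDL learner therefore prefers it.
print(holistic_dl(20))
print(compositional_dl(20))
```

Because the advantage lies entirely in the smaller lexicon, the preference for compositional structure holds even under sparse exposure, which is the regime identified in the second contribution.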
