An Overview of How Semantics and Corrections Can Help Language Learning

We present an overview of the results obtained with a computational model of language learning that takes semantics and corrections into account. The model comprises a learner and a teacher who interact in a sequence of shared situations. It was tested on limited sublanguages of ten natural languages in a common domain of situations.
