The many blessings of abstraction: A commentary on Ambridge (2020)

Ambridge argues that the existence of exemplar models for individual phenomena (words, inflection rules, etc.) suggests the feasibility of a unified, exemplars-everywhere model that eschews abstraction. The argument would be strengthened by a description of such a model, but none is provided. I show that any attempt to construct one would immediately run into significant difficulties, and that these difficulties illustrate the utility of abstractions. I conclude with a brief review of modern symbolic approaches that address the concerns Ambridge raises about abstractions.

[1] Terry Regier et al. The Emergence of Words: Attentional Learning in Form and Meaning. Cognitive Science, 2005.

[2] Joshua B. Tenenbaum et al. Human-level concept learning through probabilistic program induction. Science, 2015.

[3] J. Fodor et al. The red herring and the pet fish: why concepts still can't be prototypes. Cognition, 1996.

[4] B. Ambridge. Against stored abstractions: A radical exemplar model of language acquisition. First Language, 2020.

[5] Charles Kemp et al. How to Grow a Mind: Statistics, Structure, and Abstraction. Science, 2011.

[6] Thomas L. Griffiths et al. Modeling human categorization of natural images using deep feature representations. CogSci, 2017.

[7] R. Nosofsky. Relations between exemplar-similarity and likelihood models of classification. 1990.

[8] R. Jacobs et al. Learning abstract visual concepts via probabilistic program induction in a Language of Thought. Cognition, 2017.

[9] Hinrich Schütze et al. Multilevel Exemplar Theory. Cognitive Science, 2010.

[10] Noah D. Goodman et al. Learning a theory of causality. Psychological Review, 2011.

[11] Mark Johnson et al. Grammar induction from (lots of) words alone. COLING, 2016.

[12] Noah D. Goodman et al. Pyro: Deep Universal Probabilistic Programming. Journal of Machine Learning Research, 2018.

[13] Michael K. Tanenhaus et al. The Weckud Wetch of the Wast: Lexical Adaptation to a Novel Accent. Cognitive Science, 2008.

[14] J. Fodor. Special sciences (or: The disunity of science as a working hypothesis). Synthese, 1974.

[15] B. MacWhinney. The role of competition and timeframes: A commentary on Ambridge (2020). 2020.

[16] J. Tenenbaum et al. An integrative computational architecture for object-driven cortex. Current Opinion in Neurobiology, 2019.

[17] J. Tenenbaum et al. Variability, negative evidence, and the acquisition of verb argument constructions. Journal of Child Language, 2010.

[18] Razvan Pascanu et al. Relational inductive biases, deep learning, and graph networks. arXiv, 2018.

[19] Timothy O'Donnell et al. Productivity and Reuse in Language: A Theory of Linguistic Computation and Storage. 2015.