The Symbol Grounding Problem

There has been much discussion recently about the scope and limits of purely symbolic models of the mind and about the proper role of connectionism in cognitive modeling. This paper describes the "symbol grounding problem": How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads? How can the meanings of the meaningless symbol tokens, manipulated solely on the basis of their (arbitrary) shapes, be grounded in anything but other meaningless symbols? The problem is analogous to trying to learn Chinese from a Chinese/Chinese dictionary alone. A candidate solution is sketched: Symbolic representations must be grounded bottom-up in nonsymbolic representations of two kinds: (1) "iconic representations", which are analogs of the proximal sensory projections of distal objects and events, and (2) "categorical representations", which are learned and innate feature detectors that pick out the invariant features of object and event categories from their sensory projections. Elementary symbols are the names of these object and event categories, assigned on the basis of their (nonsymbolic) categorical representations. Higher-order (3) "symbolic representations", grounded in these elementary symbols, consist of symbol strings describing category membership relations (e.g., "An X is a Y that is Z"). Connectionism is one natural candidate for the mechanism that learns the invariant features underlying categorical representations, thereby connecting names to the proximal projections of the distal objects they stand for. In this way connectionism can be seen as a complementary component in a hybrid nonsymbolic/symbolic model of the mind, rather than a rival to purely symbolic modeling. Such a hybrid model would not have an autonomous symbolic "module," however; the symbolic functions would emerge as an intrinsically "dedicated" symbol system as a consequence of the bottom-up grounding of categories' names in their sensory representations. Symbol manipulation would be governed not just by the arbitrary shapes of the symbol tokens, but by the nonarbitrary shapes of the icons and category invariants in which they are grounded.
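
The grounding scheme outlined above lends itself to a small illustrative sketch. The toy Python below is my own illustration under assumed details, not an implementation from the paper: a sensory projection is a short feature vector, its analog copy serves as the iconic representation, a simple perceptron stands in for the connectionist component that learns each categorical feature detector, and a higher-order symbol ("zebra") is composed from grounded elementary names via a description of the form "a zebra is a horse that is striped". All category names, features, and the perceptron learner are assumptions made for illustration.

```python
import random

random.seed(0)


def iconic(projection):
    """Iconic representation: an analog copy of the proximal sensory projection."""
    return list(projection)


class CategoricalDetector:
    """Categorical representation: a feature detector that learns the invariant
    features separating members from non-members of one category (toy perceptron)."""

    def __init__(self, n_features):
        self.w = [0.0] * n_features
        self.b = 0.0

    def detect(self, icon):
        return sum(wi * xi for wi, xi in zip(self.w, icon)) + self.b > 0

    def train(self, examples, epochs=20, lr=0.1):
        # examples: list of (icon, is_member) pairs
        for _ in range(epochs):
            for icon, is_member in examples:
                error = (1 if is_member else 0) - (1 if self.detect(icon) else 0)
                self.w = [wi + lr * error * xi for wi, xi in zip(self.w, icon)]
                self.b += lr * error


def sample(four_legs, stripes):
    """Toy 'sensory projection': [has_four_legs, has_stripes] plus noise."""
    return [four_legs + random.uniform(-0.1, 0.1), stripes + random.uniform(-0.1, 0.1)]


# Elementary symbols are names assigned on the basis of categorical representations.
detectors = {"horse": CategoricalDetector(2), "striped": CategoricalDetector(2)}
detectors["horse"].train(
    [(iconic(sample(1, s)), True) for s in (0, 1) for _ in range(20)]
    + [(iconic(sample(0, s)), False) for s in (0, 1) for _ in range(20)]
)
detectors["striped"].train(
    [(iconic(sample(legs, 1)), True) for legs in (0, 1) for _ in range(20)]
    + [(iconic(sample(legs, 0)), False) for legs in (0, 1) for _ in range(20)]
)

# Higher-order symbolic representation: a symbol string over grounded elementary
# symbols, e.g. "a zebra is a horse that is striped".
symbolic = {"zebra": ["horse", "striped"]}


def names_for(projection):
    """Ground an input bottom-up: icon -> categorical detectors -> symbol names."""
    icon = iconic(projection)
    elementary = {name for name, d in detectors.items() if d.detect(icon)}
    composed = {name for name, parts in symbolic.items() if set(parts) <= elementary}
    return elementary | composed


print(names_for(sample(1, 1)))  # typically {'horse', 'striped', 'zebra'}
print(names_for(sample(1, 0)))  # typically {'horse'}
```

The point of the sketch is only structural: the higher-order symbol inherits its grounding from the elementary names, whose use is constrained by the nonsymbolic icons and learned detectors rather than by the arbitrary shapes of the tokens alone.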
