Multimodal Cognitive Architecture: Making Perception More Central to Intelligent Behavior

I propose that the notion of cognitive state be broadened from the current predicate-symbolic, Language-of-Thought framework to a multimodal one in which perceptual and kinesthetic modalities participate in thinking. Whereas the currently dominant theories in AI and Cognitive Science treat perception and motor activity as modules external to central cognition, in the proposed approach central cognition incorporates parts of the perceptual machinery. I motivate and describe the proposal schematically, and describe the implementation of a bimodal version in which a diagrammatic representation component is added to the cognitive state. The proposal explains our rich multimodal internal experience and can be a key step in the realization of embodied agents. The proposed multimodal cognitive state can significantly enhance an agent's problem solving.
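
To make the bimodal idea concrete, the sketch below is a minimal, hypothetical illustration and not the paper's actual implementation: a cognitive state holding a predicate-symbolic component alongside a diagrammatic component, with a "perception routine" that reads a spatial relation directly off the diagram and asserts it as a symbolic fact. All names here (BimodalState, DiagramObject, perceive_left_of) are invented for illustration.

    # Hypothetical sketch of a bimodal cognitive state: a predicate-symbolic
    # component alongside a diagrammatic component on which perception
    # routines operate directly. Not the paper's implementation.
    from dataclasses import dataclass, field


    @dataclass
    class DiagramObject:
        """A diagrammatic element: a point, curve, or region with geometry."""
        name: str
        kind: str                                            # "point" | "curve" | "region"
        points: list[tuple[float, float]] = field(default_factory=list)


    @dataclass
    class BimodalState:
        """Cognitive state with symbolic and diagrammatic components."""
        symbolic: set[tuple] = field(default_factory=set)    # predicate facts
        diagram: dict[str, DiagramObject] = field(default_factory=dict)

        def add_object(self, obj: DiagramObject) -> None:
            self.diagram[obj.name] = obj

        def perceive_left_of(self, a: str, b: str) -> bool:
            """A 'perception routine': read a spatial relation off the
            diagram's geometry rather than deriving it by symbolic inference."""
            ax = min(x for x, _ in self.diagram[a].points)
            bx = min(x for x, _ in self.diagram[b].points)
            return ax < bx

        def perceive_and_assert(self, a: str, b: str) -> None:
            """Perception feeds central cognition: a perceived relation
            becomes a symbolic fact available to rule-based problem solving."""
            if self.perceive_left_of(a, b):
                self.symbolic.add(("left-of", a, b))


    state = BimodalState()
    state.add_object(DiagramObject("A", "point", [(1.0, 2.0)]))
    state.add_object(DiagramObject("B", "point", [(4.0, 2.0)]))
    state.perceive_and_assert("A", "B")
    print(state.symbolic)                                    # {('left-of', 'A', 'B')}

The point of the sketch is the division of labor: spatial relations are obtained by perceptual operations on the diagrammatic component rather than by symbolic inference, while their results remain available to ordinary symbolic reasoning.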
