Infinite RAAM: A Principled Connectionist Substrate for Cognitive Modeling

Unification-based approaches have come to play an important role in both theoretical and applied modeling of cognitive processes, most notably natural language. Attempts to model such processes using neural networks have met with some success, but have faced serious hurdles caused by the limitations of standard connectionist coding schemes. As a contribution to this effort, this paper presents recent work in Infinite RAAM (IRAAM), a new connectionist unification model. Based on a fusion of recurrent neural networks with fractal geometry, IRAAM allows us to understand the behavior of these networks as dynamical systems. Using a logical programming language as our modeling domain, we show how this dynamical-systems approach solves many of the problems faced by earlier connectionist models, supporting unification over arbitrarily large sets of recursive expressions. We conclude that IRAAM can provide a principled connectionist substrate for unification in a variety of cognitive modeling domains.
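For readers unfamiliar with the symbolic operation the abstract refers to, the sketch below shows classical first-order unification over nested terms, the operation a connectionist substrate like IRAAM would need to support. This is not the paper's model; it is a minimal illustrative implementation under assumed conventions: variables are strings beginning with `?`, and compound terms are Python tuples. The occurs check is omitted for brevity, as in many Prolog implementations.

```python
def is_var(t):
    # Hypothetical convention: variables are strings starting with '?'.
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    # Follow a chain of variable bindings to its current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(x, y, subst=None):
    """Return a substitution (dict) unifying x and y, or None on failure."""
    subst = {} if subst is None else subst
    x, y = walk(x, subst), walk(y, subst)
    if x == y:
        return subst
    if is_var(x):
        return {**subst, x: y}
    if is_var(y):
        return {**subst, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        # Compound terms unify element-wise, threading the substitution through.
        for xi, yi in zip(x, y):
            subst = unify(xi, yi, subst)
            if subst is None:
                return None
        return subst
    return None

# Example: unify(("likes", "?X", "mary"), ("likes", "john", "?Y"))
# binds ?X to "john" and ?Y to "mary".
```

Because terms are recursive tuples, the same procedure applies to arbitrarily deep expressions, which is exactly the regime where fixed-width connectionist codings have historically struggled.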
