Parameterization of connectionist models

We present a method for estimating the parameters of connectionist models so that the model’s output fits empirical data as closely as possible. The method minimizes a cost function that measures the difference between statistics computed from the model’s output and statistics computed from the subjects’ performance. An optimization algorithm finds the parameter values that minimize this cost function. The cost function also indicates whether the model’s statistics differ significantly from the data’s. In some cases, the method can find the optimal parameters automatically; in others, it can facilitate a manual search for them. The method has been implemented in Matlab, is fully documented, and is available for free download from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
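
The sketch below illustrates the general fitting logic described above, not the published Matlab tool. The model, the summary statistics, the standard errors, and the starting values are hypothetical placeholders; the cost is written as a chi-square-like sum of squared standardized differences between model and data statistics, minimized with a Nelder-Mead simplex search.

```python
# Illustrative sketch only: the published tool is implemented in Matlab.
# The model, statistics, and numbers below are hypothetical stand-ins.
import numpy as np
from scipy.optimize import minimize


def simulate_model(params, n_trials=1000, rng=None):
    """Hypothetical connectionist model: returns simulated reaction times (ms)
    for a parameter vector (here, a rate-like parameter and a threshold)."""
    rng = np.random.default_rng() if rng is None else rng
    rate, threshold = params
    # Placeholder dynamics: RT grows with the threshold, shrinks with the rate.
    rts = threshold / max(rate, 1e-6) * (1.0 + 0.3 * rng.standard_normal(n_trials))
    return np.abs(rts)


def summary_stats(rts):
    """Statistics compared between model and data (mean, SD, 90th percentile)."""
    return np.array([rts.mean(), rts.std(), np.quantile(rts, 0.9)])


def cost(params, data_stats, data_se):
    """Chi-square-like cost: squared standardized differences between the
    model's statistics and the subjects' statistics."""
    model_stats = summary_stats(simulate_model(params))
    z = (model_stats - data_stats) / data_se
    return float(np.sum(z ** 2))


# Observed statistics and their standard errors (hypothetical values).
data_stats = np.array([450.0, 110.0, 600.0])
data_se = np.array([5.0, 4.0, 8.0])

# Derivative-free simplex search for the parameters that minimize the cost.
result = minimize(cost, x0=[1.0, 400.0], args=(data_stats, data_se),
                  method="Nelder-Mead")
print("best parameters:", result.x, "final cost:", result.fun)
```

A derivative-free simplex search is shown because statistics computed from a simulated model are typically noisy and not differentiable with respect to the parameters; the actual implementation may use a different optimizer.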
