Generating neural circuits that implement probabilistic reasoning.

We extend the hypothesis that neuronal populations represent and process analog variables in terms of probability density functions (PDFs). Using an intermediate representation of the probability density in terms of orthogonal functions spanning a low-dimensional function space, we show how neural circuits can be generated from Bayesian belief networks. The ideas and formalism of this PDF approach are illustrated and tested on several elementary examples, in particular a problem in which model-driven, top-down information flow influences the processing of bottom-up sensory input.
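The intermediate representation mentioned above encodes a density by its coefficients in an orthogonal function basis. As a minimal sketch of that idea (not the paper's actual construction), the snippet below projects an example density onto the first few Legendre polynomials, which form an orthogonal basis on [-1, 1], and reconstructs it from the resulting low-dimensional coefficient vector; the specific basis, grid, and example density are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

# Discretize the analog variable's domain, mapped onto [-1, 1].
x = np.linspace(-1.0, 1.0, 2001)
dx = x[1] - x[0]

# Example density: a Gaussian bump, normalized on the grid
# (an arbitrary choice for illustration).
p = np.exp(-0.5 * ((x - 0.2) / 0.3) ** 2)
p /= p.sum() * dx

# Project onto the first N Legendre polynomials P_k, using the
# orthogonality relation: a_k = (2k + 1)/2 * integral p(x) P_k(x) dx.
N = 16
a = np.array([
    (2 * k + 1) / 2.0 * np.sum(p * Legendre.basis(k)(x)) * dx
    for k in range(N)
])

# The N coefficients a_k are the low-dimensional code for the PDF;
# reconstruct the density from them to check the representation.
p_hat = Legendre(a)(x)
max_err = np.abs(p - p_hat).max()
```

For smooth densities the coefficients decay rapidly, so a handful of basis functions suffices, which is what makes the underlying function space effectively low-dimensional.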