Bayesian alternatives to neural computing

This paper investigates two types of neural organization: Hebbian and perceptron learning. Hebbian neural learning merely serves to summarize input, whereas perceptron learning adjusts to meet system objectives. Bayesian models have also been proposed as archetypes for human learning, providing decisions in an uncertain environment. Bayesian analogs to Hebbian and perceptron learning were constructed and found to respond more smoothly and predictably than neural models. They tend to discount information that is already known and provide smoother transitions from one revision to the next, even at relatively high learning rates. When a Bayesian system receives no evidence about a given parameter, activation levels decline, and the system "forgets". The Bayesian analogs of Hebbian and perceptron learning move roughly in tandem with neural networks, and yield similar decisions. But Bayesian models offer statistical performance metrics useful in the design and development of systems. The Bayesian analogs retain the features that attract engineers to neural networks, while dispelling the uneasiness associated with the "black box" character of neural systems.
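The contrast drawn above can be illustrated with a minimal sketch. The Hebbian and perceptron update rules below are the standard textbook forms, and the Bayesian analog is a generic conjugate Beta-Bernoulli revision chosen for illustration, not the paper's specific construction: successive revisions shrink as evidence accumulates (discounting what is already known), and a decay toward the prior models "forgetting" when no evidence arrives.

```python
def hebbian_update(w, x, y, lr=0.5):
    # Hebbian rule: the weight grows with input/output co-activity;
    # it summarizes input statistics without any error signal.
    return w + lr * x * y

def perceptron_update(w, x, y_target, lr=0.5):
    # Perceptron rule: the weight moves to reduce the error between
    # the target and the thresholded output (a system objective).
    y_pred = 1.0 if w * x > 0 else 0.0
    return w + lr * (y_target - y_pred) * x

def beta_update(alpha, beta, evidence):
    # Conjugate Beta-Bernoulli revision: each observation shifts the
    # posterior mean, and the shift shrinks as alpha + beta grows,
    # so information that is already known is discounted.
    if evidence:
        alpha += 1.0
    else:
        beta += 1.0
    return alpha, beta

def forget(alpha, beta, rho=0.9, prior=(1.0, 1.0)):
    # With no evidence, counts decay toward the prior, so posterior
    # confidence declines: the system "forgets".
    return (prior[0] + rho * (alpha - prior[0]),
            prior[1] + rho * (beta - prior[1]))

# Posterior mean after a run of identical positive evidence:
# each revision is smaller than the last (smooth transitions).
a, b = 1.0, 1.0
means = []
for _ in range(5):
    a, b = beta_update(a, b, True)
    means.append(a / (a + b))
```

Here the neural rules take a fixed-size step governed by the learning rate, while the Bayesian revision's step size falls automatically with accumulated evidence, which is one way to read the abstract's claim of smoother, more predictable transitions.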