Publisher Summary This chapter focuses on two widely used classes of artificial neural network (ANN) models: the perceptron-back-propagation model and the Hopfield–Boltzmann machine models. It explores in detail the characteristics of a simple feedforward ANN, presents the perceptron-back-propagation model in an intuitive, applications-oriented style, and then describes the Hopfield–Boltzmann machine models. ANNs represent a fundamentally different approach to computation: they are explicitly designed to mimic the basic organizational feature of biological nervous systems, parallel distributed processing. ANNs have also been called "parallel distributed processing," "connectionist," and "neuromorphic" systems. They are novel, robust computational tools for pattern recognition, data mapping, and other applications, and the demonstrated success of ANN techniques has led to an explosion of interest in scientific and engineering circles. Both commercial and public-domain ANN software exists for a variety of computer systems, ranging from small personal computers to large, massively parallel supercomputers. An ANN consists of a large number of simple interconnected processing elements, where the processing elements are simplified models of neurons and the interconnections between the processing elements are simplified models of the synapses between neurons.
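The processing-element view described above can be made concrete with a small sketch. The code below is not the chapter's implementation; it is a minimal illustration, under assumed details (sigmoid activation, the single-layer delta rule as a simplification of back-propagation, and a made-up `train_perceptron` helper), of how a simplified neuron forms a weighted sum of its inputs and how its connection weights are adjusted from examples:

```python
import math
import random

def sigmoid(x):
    """Logistic activation: squashes the weighted sum into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def feedforward(inputs, weights, bias):
    """One processing element (simplified neuron): a weighted sum of
    inputs over the connections (simplified synapses), plus a bias,
    passed through a nonlinear activation."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(total)

def train_perceptron(samples, lr=0.5, epochs=2000, seed=0):
    """Hypothetical helper: train a single processing element with the
    delta rule, the one-layer special case of back-propagation."""
    rng = random.Random(seed)
    n_inputs = len(samples[0][0])
    weights = [rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            out = feedforward(inputs, weights, bias)
            # Error times the sigmoid derivative gives the weight update.
            grad = (target - out) * out * (1.0 - out)
            weights = [w + lr * grad * x for w, x in zip(weights, inputs)]
            bias += lr * grad
    return weights, bias

# A toy pattern-recognition task: learn logical OR from examples.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_perceptron(data)
predictions = [round(feedforward(x, w, b)) for x, _ in data]
```

A single element like this can only separate linearly separable patterns; the multi-layer feedforward networks treated in the chapter remove that limitation by propagating the error back through hidden layers.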