The Performance of Convex Set Projection Based Neural Networks

We consider a class of neural networks whose performance can be analyzed and geometrically visualized in a signal-space setting. Alternating projection neural networks (APNNs) operate by alternately projecting between two or more constraint sets. Criteria for desired and unique convergence are easily established. The network can be configured in either a homogeneous or a layered form. The number of patterns that can be stored in the network is on the order of the number of input and hidden neurons. If the output neurons can take on only one of two states, then the trained layered APNN can easily be configured to converge in a single iteration. More generally, convergence is at an exponential rate. Convergence can be improved by the use of sigmoid-type nonlinearities, network relaxation, and/or an increase in the number of neurons in the hidden layer. The manner in which the network responds to data for which it was not specifically trained (i.e., how it generalizes) can be directly evaluated analytically.
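
A minimal numerical sketch of the alternating projection recall described above follows. It is an illustration under stated assumptions, not the paper's implementation: the function names (subspace_projector, recall) and the toy patterns are ours, patterns are assumed real-valued and stored as linearly independent columns, and the clamped (input) neurons are re-imposed after each subspace projection.

```python
import numpy as np

# Illustrative APNN-style sketch (assumed setup, not the paper's code):
# stored patterns span a subspace; recall alternates between projecting
# onto that subspace and re-clamping the known (input) neuron values.

def subspace_projector(patterns):
    """Orthogonal projector onto the span of the stored patterns.

    patterns: (n_neurons, n_patterns) matrix F with linearly
    independent columns. Returns P = F (F^T F)^{-1} F^T.
    """
    F = np.asarray(patterns, dtype=float)
    return F @ np.linalg.solve(F.T @ F, F.T)

def recall(P, probe, clamped, n_iter=50):
    """Alternately (1) project onto the stored-pattern subspace and
    (2) project onto the affine set fixing the clamped neurons."""
    x = np.asarray(probe, dtype=float).copy()
    idx = np.asarray(clamped)
    known = x[idx].copy()          # values of the clamped (input) neurons
    for _ in range(n_iter):
        x = P @ x                  # projection 1: span of stored patterns
        x[idx] = known             # projection 2: re-clamp known neurons
    return x

# Store two 4-neuron patterns as columns, then recall the first pattern
# from a probe whose last two neurons are unknown (set to zero).
F = np.array([[1., 0.],
              [1., 1.],
              [0., 1.],
              [1., 0.]])
P = subspace_projector(F)
probe = np.array([1., 1., 0., 0.])
print(recall(P, probe, clamped=[0, 1]))   # -> approximately [1. 1. 0. 1.]
```

With the clamped neurons pinning down a unique point in the intersection of the two sets, the iteration converges to the stored pattern at the exponential (geometric) rate noted in the abstract.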
