Parametric connectivity: feasibility of learning in constrained weight space

This paper examines how constraining specific artificial neural models affects the performance of selected learning algorithms. The constraint model considered is parametric connectivity (PC), in which the weights of a unit's incoming links are constrained to be a function of a relatively small number of parameters. PC can, in principle, be implemented in an electro-optical system using devices such as photodetectors, miniature electro-optical cells, and laser diodes; low-resolution holographic mirrors may be used to direct the global structure of the network architecture. A simulation using PC has been developed, and layered PC networks that implement simple logic functions are currently being investigated. The performance of networks built from PC units (PCUs) is measured by incorporating PC into the generalized delta rule and into genetic algorithms to assess learning capacity. PC permits almost complete generality in network implementation while exploiting the performance of optical systems.
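The core idea above — a unit whose many incoming weights are generated from a few trainable parameters, with the generalized delta rule pushed through the weight-generating function — can be illustrated with a minimal sketch. The cosine basis, the single-unit OR task, and all learning-rate and size choices here are assumptions for illustration; the paper does not specify the parametric form it uses.

```python
import math
import random

def pc_weights(params, n_links):
    # Parametric connectivity: each incoming weight is a function of a
    # small parameter vector rather than a free variable.
    # Here (an assumed choice): w_i = sum_k params[k] * cos(pi * k * i / n_links)
    return [sum(p * math.cos(math.pi * k * i / n_links)
                for k, p in enumerate(params))
            for i in range(n_links)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_pcu(samples, n_links, n_params, lr=0.5, epochs=10000, seed=0):
    # Generalized delta rule applied to the PC parameters: the error
    # gradient w.r.t. each weight is chained through the basis function,
    # so only n_params values are ever updated, not n_links weights.
    rng = random.Random(seed)
    params = [rng.uniform(-0.5, 0.5) for _ in range(n_params)]
    for _ in range(epochs):
        for x, target in samples:
            w = pc_weights(params, n_links)
            y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            delta = (y - target) * y * (1.0 - y)   # squared-error delta
            for k in range(n_params):
                # dE/dp_k = delta * sum_i x_i * d w_i / d p_k
                grad_k = delta * sum(x[i] * math.cos(math.pi * k * i / n_links)
                                     for i in range(n_links))
                params[k] -= lr * grad_k
    return params

# A PCU learning logical OR: 3 incoming links (bias x[0] = 1.0 plus two
# inputs) driven by only 2 parameters.
samples = [([1.0, a, b], float(a or b)) for a in (0, 1) for b in (0, 1)]
params = train_pcu(samples, n_links=3, n_params=2)
```

Note that the constrained problem is still separable here: two basis coefficients suffice to place the three effective weights on the correct side of the decision boundary, which is the kind of feasibility question the paper's experiments probe.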
