Parallel, self-organizing, hierarchical neural networks. II

For pt. I see IEEE Trans. Neural Networks, vol. 1, pp. 167-178 (1990). Parallel, self-organizing, hierarchical neural networks (PSHNNs) consist of a number of stages with error detection at the end of each stage: error-causing vectors are rejected and, after a nonlinear transformation, fed into the next stage. During testing, the stages operate in parallel. The statistical properties of the PSHNN and its mechanisms of vector rejection are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, because the error-detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and to attenuators for pointwise operations.
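The reject-and-cascade control flow described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the paper's trained networks: each hypothetical stage is a single linear discriminant whose error-detection bound is an acceptance threshold on the score magnitude, and the rejection transform is an arbitrary componentwise nonlinearity. The sketch runs the stages sequentially for clarity; in the PSHNN the stages operate in parallel at test time, each vector being decided by the stage that accepts it.

```python
# Toy PSHNN-style cascade (illustrative only; stage form, thresholds, and
# the rejection nonlinearity are assumptions, not the paper's design).

def make_stage(weights, bias, accept_threshold):
    """A toy one-layer stage: linear score plus an error-detection test."""
    def stage(x):
        score = sum(w * xi for w, xi in zip(weights, x)) + bias
        label = 1 if score >= 0 else 0
        # Error-detection bound: reject vectors whose score is too close
        # to the decision boundary (likely error-causing vectors).
        accepted = abs(score) >= accept_threshold
        return label, accepted
    return stage

def nonlinear_transform(x):
    """Toy componentwise nonlinearity applied to rejected vectors
    before they are fed into the next stage."""
    return [xi * abs(xi) for xi in x]

def pshnn_classify(stages, x):
    """Each stage decides the vectors it accepts; rejected vectors are
    nonlinearly transformed and passed on to the next stage."""
    for stage in stages:
        label, accepted = stage(x)
        if accepted:
            return label
        x = nonlinear_transform(x)
    return label  # fall back to the last stage's decision

stages = [
    make_stage([1.0, -1.0], 0.0, accept_threshold=0.5),
    make_stage([0.5, 0.5], -0.2, accept_threshold=0.0),  # accepts all
]
print(pshnn_classify(stages, [2.0, 0.1]))  # far from boundary: stage 1 decides
print(pshnn_classify(stages, [0.3, 0.2]))  # near boundary: rejected to stage 2
```

Widening a stage's `accept_threshold` pushes more borderline vectors to later stages, which is the lever the paper uses to absorb errors in the weight values.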
