Derivation of a class of training algorithms

A novel derivation is presented of T. Kohonen's topographic mapping training algorithm (Self-Organization and Associative Memory, 1984), based upon an extension of the Linde-Buzo-Gray (LBG) algorithm for vector quantizer design. In this framework, a vector quantizer is designed by minimizing an L2 reconstruction distortion measure that includes an additional contribution from code noise, which corrupts the output of the vector quantizer. The neighborhood updating scheme of Kohonen's topographic mapping training algorithm emerges as a special case of this code noise model. This formulation of Kohonen's algorithm is a specific instance of the robust hidden layer principle, which stabilizes the internal representations chosen by a network against anticipated noise or distortion processes.
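The idea in the abstract can be sketched concretely. Below is a minimal, hypothetical illustration (not the paper's own code): a vector quantizer is trained by stochastic gradient descent on the L2 distortion averaged over a code-noise distribution `pi[n, y]` (the probability that transmitted code index `y` is received as `n`). The resulting update moves every code vector toward the input, weighted by `pi[:, y]` for the winning index `y` — exactly the form of Kohonen's neighborhood update, with the code-noise spread `sigma` playing the role of the neighborhood width. The lattice size, noise model, and learning schedule here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 10        # number of code vectors on a 1-D index lattice (assumed)
sigma = 1.5   # code-noise spread; plays the role of the neighborhood width
idx = np.arange(M)

# pi[n, y]: probability that transmitted code index y is received as n.
# A Gaussian on the index lattice is one plausible code-noise model.
pi = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / sigma) ** 2)
pi /= pi.sum(axis=0, keepdims=True)   # each column is a probability distribution

codebook = rng.normal(size=(M, 2))    # code vectors in a 2-D input space
eta = 0.1                             # learning rate (assumed constant)

for _ in range(2000):
    x = rng.uniform(-1.0, 1.0, size=2)                    # training sample
    y = int(np.argmin(((codebook - x) ** 2).sum(axis=1))) # nearest-neighbour encoding
    # Stochastic gradient step on the noise-averaged L2 distortion:
    # every code vector moves toward x, weighted by the code-noise column pi[:, y].
    codebook += eta * pi[:, y][:, None] * (x - codebook)
```

As `sigma` tends to zero, `pi` approaches the identity matrix and the update reduces to ordinary LBG-style competitive learning; a finite `sigma` reproduces the neighborhood updating of Kohonen's algorithm.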

[1] S. P. Lloyd, et al. Least squares quantization in PCM, 1982, IEEE Trans. Inf. Theory.

[2] Joel Max, et al. Quantizing for minimum distortion, 1960, IRE Trans. Inf. Theory.

[3] Stephen P. Luttrell. Image compression using a multilayer neural network, 1989, Pattern Recognit. Lett.

[4] K. Schulten, et al. Kohonen's self-organizing maps: exploring their computational capabilities, 1988, IEEE 1988 International Conference on Neural Networks.

[5] S. P. Luttrell, et al. The Use of Bayesian and Entropic Methods in Neural Network Theory, 1989.

[6] Robert M. Gray, et al. An Algorithm for Vector Quantizer Design, 1980, IEEE Trans. Commun.

[7] S. P. Luttrell, et al. Hierarchical vector quantisation, 1989.

[8] S. P. Luttrell. Self-organising multilayer topographic mappings, 1988, IEEE 1988 International Conference on Neural Networks.

[9] D. S. Bradburn. Reducing transmission error effects using a self-organizing network, 1989, International 1989 Joint Conference on Neural Networks.

[10] Teuvo Kohonen, et al. Self-Organization and Associative Memory, 1988.