Ordered neural maps and their applications to data compression

The implicit ordering in scalar quantization is used to motivate the need for explicit ordering in vector quantization, and the ordering produced by Kohonen's neural net vector quantizer is shown to provide a multidimensional analog of this scalar quantization ordering. Ordered vector quantization using Kohonen's neural net was successfully applied to image coding and was then shown to be advantageous for progressive transmission. In particular, the intermediate images had a signal-to-noise ratio close to that of a standard tree-structured vector quantizer, while the final full-fidelity image from the neural net vector quantizer was superior to that of the tree-structured vector quantizer. Subsidiary results include a new definition of an index of disorder, which was empirically found to correlate strongly with the progressive reduction of image signal-to-noise ratio, and a hybrid neural net/generalized Lloyd training algorithm that achieves a high final image signal-to-noise ratio while still maintaining ordering.
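
To illustrate the kind of ordered codebook the abstract refers to, the following is a minimal sketch (not the paper's implementation) of training a one-dimensional Kohonen self-organizing map as a vector quantizer codebook. The function names (`train_som_vq`, `quantize`), the learning-rate and neighborhood schedules, and the use of NumPy are illustrative assumptions; the key point is that neighboring codeword indices are updated together, which is what induces the index ordering exploited for progressive transmission.

```python
import numpy as np

def train_som_vq(training_vectors, codebook_size=64, epochs=20,
                 lr0=0.5, radius0=None, seed=0):
    """Train a 1-D Kohonen SOM codebook on source vectors (e.g. image blocks).

    Returns a (codebook_size, d) codebook whose indices are topologically
    ordered: nearby indices hold similar codewords.
    """
    rng = np.random.default_rng(seed)
    n, d = training_vectors.shape
    if radius0 is None:
        radius0 = codebook_size / 2.0
    # Initialize codewords from randomly chosen training vectors.
    codebook = training_vectors[rng.choice(n, codebook_size, replace=False)].astype(float)
    total_steps = epochs * n
    step = 0
    for _ in range(epochs):
        for x in training_vectors[rng.permutation(n)]:
            frac = step / total_steps
            lr = lr0 * (1.0 - frac)                    # decaying learning rate (assumed schedule)
            radius = max(1.0, radius0 * (1.0 - frac))  # shrinking neighborhood (assumed schedule)
            # Best-matching unit: index of the nearest codeword.
            bmu = int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))
            # Gaussian neighborhood over codeword *indices* -- updating
            # neighbors together is what produces the ordered map.
            idx = np.arange(codebook_size)
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * radius ** 2))
            codebook += lr * h[:, None] * (x - codebook)
            step += 1
    return codebook

def quantize(vectors, codebook):
    """Map each source vector to the index of its nearest codeword."""
    d2 = np.sum((vectors[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
    return np.argmin(d2, axis=1)
```

Because the trained codebook is ordered, transmitting only the most significant bits of each codeword index already selects a codeword close to the final one, which is the property the abstract credits for graceful intermediate-image quality during progressive transmission.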