A multilayer perceptron in the Chebyshev norm for image data compression

The author verifies the image data compression and generalization characteristics of feedforward neural networks trained with the Chebyshev-norm backpropagation algorithm, which allows a considerable reduction of the computational cost. It is shown that the use of the L∞-norm algorithm greatly alleviates the problem of training-phase computational cost, which is particularly relevant in the case of image data processing. This reduction in computational cost does not appreciably affect the generalization performance of the network.
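The abstract does not detail the algorithm, but the central idea of Chebyshev-norm (L∞) training can be illustrated: per pattern, the error is measured by its largest component, so only the worst output unit contributes a gradient, which is where the computational saving comes from. The sketch below is an assumption-laden illustration, not the authors' exact method: the network sizes, learning rate, and the error-proportional update on the selected unit are all hypothetical, and a tiny one-hidden-layer autoencoder stands in for the image-compression MLP.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny autoencoder: a 16-pixel block compressed to 4 values.
n_in, n_hid = 16, 4
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))  # encoder weights
W2 = rng.normal(scale=0.1, size=(n_in, n_hid))  # linear decoder weights
lr = 0.1

def forward(x):
    h = np.tanh(W1 @ x)   # hidden (compressed) representation
    y = W2 @ h            # linear reconstruction
    return h, y

x = rng.random(n_in)      # one image block with values in [0, 1)
init_err = np.max(np.abs(forward(x)[1] - x))  # initial L-inf error

for _ in range(500):
    h, y = forward(x)
    e = y - x
    k = np.argmax(np.abs(e))       # Chebyshev norm: locate the worst pixel
    delta_out = np.zeros(n_in)
    delta_out[k] = e[k]            # only this unit backpropagates error
    # Backward pass restricted to the single active output unit:
    grad_W2 = np.outer(delta_out, h)
    delta_hid = (W2.T @ delta_out) * (1 - h**2)  # tanh derivative
    grad_W1 = np.outer(delta_hid, x)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

max_err = np.max(np.abs(forward(x)[1] - x))  # final L-inf error
```

Because only one output unit (and the hidden-layer terms it touches) enters each backward pass, the per-pattern cost of the backward step shrinks relative to a full L2 backpropagation, which is the cost reduction the abstract refers to.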