A multilayer perceptron in the Chebyshev norm for image data compression
The author verifies the image data compression and generalization characteristics of feedforward neural networks trained with a Chebyshev-norm backpropagation algorithm, which allows a considerable reduction of the computational cost. It is shown that the use of the L∞-norm algorithm greatly alleviates the training-phase computational cost, which is particularly relevant in the case of image data processing. This reduction in computational cost does not appreciably affect the generalization performance of the network.
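The computational saving comes from the form of the Chebyshev-norm error: for each training pattern only the output component with the largest absolute error contributes a non-zero delta, so most of the output-layer backpropagation work disappears. The following is a minimal sketch of this idea, not the authors' original implementation; the network sizes, learning rate, function name linf_backprop_step, and the use of random 8x8 blocks as stand-ins for image data are all illustrative assumptions.

```python
import numpy as np

# Sketch (assumption, not the paper's code): one backpropagation step where the
# per-pattern error is the Chebyshev (L-infinity) norm E = max_k |t_k - y_k|
# instead of the usual sum of squared errors. Only the worst-case output unit
# back-propagates a non-zero delta, which is the source of the saving.

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 64, 16, 64   # e.g. an 8x8 image block auto-associator
lr = 0.05

W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def linf_backprop_step(x, t):
    """One training step using a subgradient of the Chebyshev-norm error."""
    h = sigmoid(W1 @ x)           # hidden activations
    y = sigmoid(W2 @ h)           # reconstructed block

    err = t - y
    k = np.argmax(np.abs(err))    # index of the worst-case output component

    # Output deltas: zero everywhere except at the max-error unit.
    delta_out = np.zeros(n_out)
    delta_out[k] = np.sign(err[k]) * y[k] * (1.0 - y[k])

    # Hidden deltas involve only one row of W2 (vs. all rows in L2 backprop).
    delta_hid = (W2[k] * delta_out[k]) * h * (1.0 - h)

    W2 += lr * np.outer(delta_out, h)   # in-place weight updates
    W1 += lr * np.outer(delta_hid, x)
    return np.max(np.abs(err))          # current Chebyshev error

# Usage example: train on random blocks standing in for image data.
for _ in range(1000):
    block = rng.random(n_in)
    linf_backprop_step(block, block)    # auto-associative compression target
```

Compared with the standard generalized delta rule, the only change is the choice of output deltas; the rest of the forward and backward pass is the usual sigmoid multilayer perceptron machinery.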