Image compression using principal component neural networks

Principal component analysis (PCA) is a well-known statistical technique for studying the correlations among the components of multivariate data and for reducing redundancy by projecting the data onto a suitable basis. PCA may be performed either in batch mode or recursively; the recursive approach has proven very effective for high-dimensional data, as in image compression. The aim of this paper is to compare principal component neural networks for still image compression and coding. We first recall basic concepts of neural PCA, then review a number of principal component networks from the scientific literature and compare their structures, learning algorithms and computational requirements, discussing the advantages and drawbacks of each technique. The conclusion of our broad comparison among eight principal component networks is that the cascade recursive least-squares algorithm by Cichocki, Kasprzak and Skarbek exhibits the best numerical and structural properties. © 2001 Elsevier Science B.V. All rights reserved.
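To make the batch-versus-recursive distinction concrete, the minimal Python/NumPy sketch below contrasts batch PCA (eigendecomposition of the block covariance matrix) with a recursive, Sanger-type generalized Hebbian rule that estimates the same principal subspace one image block at a time. It is an illustration only, not one of the eight networks compared in the paper (in particular, it is not the cascade recursive least-squares algorithm); the synthetic image, the 8×8 block size, the number of retained components and the learning rate are all assumptions made for the example.

```python
# Minimal sketch: batch PCA vs. a recursive Hebbian (Sanger/GHA) estimate of the
# principal subspace, applied to 8x8 image blocks. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, spatially correlated "image" (stand-in for a real test image).
img = rng.standard_normal((256, 256)).cumsum(axis=0).cumsum(axis=1)

# Split into 8x8 blocks, one 64-dimensional vector per block.
blocks = img.reshape(32, 8, 32, 8).transpose(0, 2, 1, 3).reshape(-1, 64)
blocks = blocks - blocks.mean(axis=0)   # zero-mean data
blocks = blocks / blocks.std()          # rescale so the recursive rule stays stable

k = 8                                   # retained components: 64 -> 8 coefficients per block

# --- Batch PCA: eigenvectors of the sample covariance matrix ---------------
cov = blocks.T @ blocks / len(blocks)
_, eigvec = np.linalg.eigh(cov)         # eigenvalues returned in ascending order
W_batch = eigvec[:, ::-1][:, :k]        # top-k principal directions as columns

# --- Recursive PCA: generalized Hebbian (Sanger) learning rule -------------
W = 0.01 * rng.standard_normal((64, k))
eta = 1e-3                              # small fixed learning rate (assumed)
for _ in range(10):                     # a few passes over the blocks
    for x in blocks:
        y = W.T @ x
        # dW[:, j] = eta * y_j * (x - sum_{i<=j} y_i W[:, i])  -> ordered components
        W += eta * (np.outer(x, y) - W @ np.triu(np.outer(y, y)))

# Compress (project) and decompress (reconstruct) one block with each basis.
x = blocks[0]
for name, B in [("batch", W_batch), ("recursive", W)]:
    code = B.T @ x                      # k coefficients to store or transmit
    x_hat = B @ code                    # reconstructed block
    err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
    print(f"{name:9s} basis: relative reconstruction error = {err:.3f}")
```

Keeping only the k projection coefficients per block (plus the shared basis) is what yields the compression; the recursive rule never forms or diagonalizes the full covariance matrix, which is the property that makes neural PCA attractive for high-dimensional data.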

[1] 佐藤 保, et al. Principal Components, 2021, Encyclopedic Dictionary of Archaeology.

[2] Juha Karhunen, et al. Generalizations of principal component analysis, optimization problems, and neural networks, 1995, Neural Networks.

[3] Mahmood R. Azimi-Sadjadi, et al. Principal component extraction using recursive least squares learning, 1995, IEEE Trans. Neural Networks.

[4] Andrzej Cichocki, et al. Adaptive learning algorithm for principal component analysis with partial data, 1996.

[5] J. Karhunen, et al. Nonlinear generalizations of principal component learning algorithms, 1993, Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan).

[6] Juha Karhunen, et al. Stability of Oja's PCA Subspace Rule, 1994, Neural Computation.

[7] J. Karhunen, et al. A bigradient optimization approach for robust PCA, MCA, and source separation, 1995, Proceedings of ICNN'95 - International Conference on Neural Networks.

[8] Aurelio Uncini, et al. A unified approach to laterally-connected neural nets, 1998, 9th European Signal Processing Conference (EUSIPCO 1998).

[9] Heekuck Oh, et al. Neural Networks for Pattern Recognition, 1993, Adv. Comput.

[10] S.Y. Kung, et al. Adaptive Principal component EXtraction (APEX) and applications, 1994, IEEE Trans. Signal Process.

[11] Gavril Toderean, et al. Merging the transform step and the quantization step for Karhunen-Loeve transform based image compression, 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium.

[12] M.C.F. De Castro, et al. A complex valued Hebbian learning algorithm, 1998, 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227).

[13] S. Fiori, et al. A general class of ψ-APEX PCA neural algorithms, 2000.

[14] Juha Karhunen, et al. Nonlinear PCA type approaches for source separation and independent component analysis, 1995, Proceedings of ICNN'95 - International Conference on Neural Networks.

[15] J. Rubner, et al. A Self-Organizing Network for Principal-Component Analysis, 1989.

[16] Lei Xu, et al. Least mean square error reconstruction principle for self-organizing neural-nets, 1993, Neural Networks.

[17] Simone G. O. Fiori, et al. Blind separation of circularly distributed sources by neural extended APEX algorithm, 2000, Neurocomputing.

[18] Hazem M. Abbas, et al. Neural model for Karhunen-Loève transform with application to adaptive image compression, 1993, INFOCOM 1994.

[19] Anil K. Jain. Fundamentals of Digital Image Processing, 2018, Control of Color Imaging Systems.

[20] Simon Haykin. Neural networks, 1994.

[21] E. Oja. Simplified neuron model as a principal component analyzer, 1982, Journal of Mathematical Biology.

[22] Juha Karhunen, et al. Learning of robust principal component subspace, 1993, Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan).

[23] Erkki Oja, et al. Principal components, minor components, and linear neural networks, 1992, Neural Networks.

[24] Terence D. Sanger, et al. Optimal unsupervised learning in a single-layer linear feedforward neural network, 1989, Neural Networks.

[25] George Mathew, et al. Orthogonal eigensubspace estimation using neural networks, 1994, IEEE Trans. Signal Process.

[26] Erkki Oja, et al. Neural Networks, Principal Components, and Subspaces, 1989, Int. J. Neural Syst.

[27] J. Karhunen. Optimization criteria and nonlinear PCA neural networks, 1994, Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94).

[28] Juha Karhunen, et al. Representation and separation of signals using nonlinear PCA type learning, 1994, Neural Networks.