Lossless coding of hyperspectral images with principal polynomial analysis

Transforms in image coding aim to remove the redundancy among data coefficients, so that they can be coded independently, and to capture most of the image information in a few coefficients. While the second goal ensures that discarding coefficients will not lead to large errors, the first ensures that simple (point-wise) coding schemes can be applied to the retained coefficients with optimal results. Principal Component Analysis (PCA) provides the best independence and data compaction for Gaussian sources; non-linear generalizations of PCA may, however, perform better on more realistic non-Gaussian sources. Principal Polynomial Analysis (PPA) generalizes PCA by removing the non-linear relations among components through regression, and has been analytically shown to outperform PCA in dimensionality reduction. Here we explore the suitability of reversible PPA for lossless compression of hyperspectral images. We find that reversible PPA performs worse than PCA because of the large impact of rounding errors and the amount of side information it requires. We then propose two generalizations: Backwards PPA, in which the polynomial estimations are performed in reverse order, and Double-Sided PPA, in which more than one dimension is used in the predictions. Both yield better coding performance than canonical PPA and are comparable to PCA.
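To make the PPA idea concrete, below is a minimal NumPy sketch, not the authors' implementation: the data are first rotated with PCA, and the polynomial dependence of each trailing component on the leading component is then removed by regression, keeping only the residual. The function names, the choice of predicting every component from the first one only, and the polynomial degree are illustrative assumptions; the reversible PPA studied for lossless coding additionally involves integer rounding and the transmission of the fitted polynomials as side information.

```python
import numpy as np

def ppa_forward(X, degree=2):
    """Illustrative PPA-style forward transform (float sketch, not reversible-integer).

    1. Center the data and rotate it with PCA.
    2. For each trailing component, fit a polynomial regression on the leading
       component and subtract the prediction, keeping only the residual.
    Returns the coefficients plus the rotation, mean and fitted polynomials,
    i.e. the side information needed to invert the transform.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA rotation
    Y = Xc @ Vt.T
    polys = []
    for j in range(1, Y.shape[1]):
        coeffs = np.polyfit(Y[:, 0], Y[:, j], degree)   # predict component j from component 0
        polys.append(coeffs)
        Y[:, j] = Y[:, j] - np.polyval(coeffs, Y[:, 0]) # keep only the residual
    return Y, Vt, mean, polys

def ppa_inverse(Y, Vt, mean, polys):
    """Invert the sketch above by adding the polynomial predictions back."""
    Y = Y.copy()
    for j, coeffs in enumerate(polys, start=1):
        Y[:, j] = Y[:, j] + np.polyval(coeffs, Y[:, 0])
    return Y @ Vt + mean
```

In this picture, Backwards PPA would reverse the order in which the components are predicted, and Double-Sided PPA would regress each component on more than one other dimension instead of a single one.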
