Learned Iterative Decoding for Lossy Image Compression Systems

For lossy image compression systems, we develop an algorithm, called iterative refinement, that improves the decoder's reconstruction over standard decoding techniques. Specifically, we propose a recurrent neural network approach to nonlinear, iterative decoding. Our neural decoder, which can work with any encoder, employs self-connected memory units that exploit both causal and non-causal spatial context to progressively reduce reconstruction error over a fixed number of steps. We experiment with variations of our proposed estimator and obtain gains of up to 0.8921 decibels (dB) over the standard JPEG algorithm and 0.5848 dB over a state-of-the-art neural compression model.
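The core loop of iterative refinement can be sketched as follows: starting from the standard decoder's output, a learned module predicts a correction at each step, and the correction is added back to the running reconstruction for a fixed number of iterations. This is a minimal illustrative sketch only; `refine_step`, `iterative_decode`, and the box-filter `toy_predictor` are hypothetical stand-ins for the paper's recurrent memory units, not its actual architecture.

```python
import numpy as np

def refine_step(x, residual_predictor):
    # One refinement iteration: predict a correction from the current
    # reconstruction and add it back. The predictor is a placeholder
    # for a learned recurrent module.
    return x + residual_predictor(x)

def iterative_decode(x0, residual_predictor, steps=5):
    # Progressively refine the initial decoder output over a fixed
    # number of steps, keeping the intermediate reconstructions.
    x = x0
    trajectory = [x0]
    for _ in range(steps):
        x = refine_step(x, residual_predictor)
        trajectory.append(x)
    return x, trajectory

def toy_predictor(x):
    # Hypothetical predictor: move each pixel halfway toward its 3x3
    # local mean (a crude denoiser standing in for the learned RNN).
    padded = np.pad(x, 1, mode="edge")
    local_mean = sum(
        padded[i:i + x.shape[0], j:j + x.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    return 0.5 * (local_mean - x)

rng = np.random.default_rng(0)
target = np.zeros((16, 16))                       # toy "clean" image
x0 = target + 0.1 * rng.standard_normal((16, 16)) # noisy initial decode
xT, traj = iterative_decode(x0, toy_predictor, steps=5)
```

In the real system, the fixed step count trades decoding latency against reconstruction quality; here the toy predictor merely demonstrates that repeated residual corrections can monotonically reduce error on a smooth target.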
