On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions

Progressive encoding of a signal generally involves an estimation step designed to reduce the entropy of the residual of an observation below the entropy of the observation itself. Often the conditional distributions of an observation, given already-encoded observations, are well fit by a class of symmetric and unimodal distributions (e.g., the two-sided geometric distributions in images of natural scenes, or symmetric Paretian distributions in models of financial data). It is common practice to choose an estimator that centers, or aligns, the modes of these conditional distributions, on the intuition that this minimizes the entropy, and hence the coding cost, of the residuals. Apart from a special case, however, there has been no rigorous proof. Here we prove that the entropy of an arbitrary mixture of symmetric and unimodal distributions is minimized by aligning the modes. The result generalizes to unimodal and rotation-invariant distributions in R^n. We illustrate the result through experiments with natural images.
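The following is a minimal numerical sketch of the claim, not the paper's proof: it estimates the differential entropy of a mixture of symmetric, unimodal densities (Laplace components standing in for the two-sided geometric case) as the component modes are pulled apart. The component scales, mixture weights, and the integration grid are illustrative assumptions.

```python
import numpy as np

def laplace_pdf(x, mu, b):
    """Density of a Laplace distribution with mode mu and scale b."""
    return np.exp(-np.abs(x - mu) / b) / (2.0 * b)

def mixture_entropy(modes, scales, weights, grid):
    """Differential entropy (nats) of a Laplace mixture, by numerical integration."""
    p = sum(w * laplace_pdf(grid, mu, b)
            for w, mu, b in zip(weights, modes, scales))
    dx = grid[1] - grid[0]
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask])) * dx

grid = np.linspace(-60.0, 60.0, 200001)
scales = [0.5, 1.0, 3.0]     # assumed component scales
weights = [0.2, 0.5, 0.3]    # assumed mixture weights

# Entropy as the modes are spread apart by a common offset delta;
# delta = 0 corresponds to aligned modes.
for delta in [0.0, 0.5, 1.0, 2.0, 5.0]:
    modes = [-delta, 0.0, delta]
    H = mixture_entropy(modes, scales, weights, grid)
    print(f"mode spread {delta:4.1f}  ->  entropy {H:.4f} nats")
```

Under these assumptions the printed entropy is smallest at delta = 0 and grows as the modes separate, consistent with the stated result that aligning the modes minimizes the entropy of the mixture.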
