Rate Distortion Behavior of Sparse Sources

The rate-distortion behavior of sparse memoryless sources is studied. These sources serve as models of sparse signal representations and facilitate the performance analysis of "sparsifying" transforms, such as the wavelet transform, and of nonlinear approximation schemes. For strictly sparse binary sources with Hamming distortion, R(D) is shown to be almost linear. For non-strictly sparse continuous-valued sources, termed compressible, two measures of compressibility are introduced: incomplete moments and geometric mean. The former lead to low- and high-rate upper bounds on the mean-squared-error distortion D(R), while the latter yields lower and upper bounds on the source entropy, thereby characterizing the asymptotic R(D) behavior. Thus, the notion of compressibility is quantitatively connected with actual lossy compression. These bounding techniques are applied to two source models: Gaussian mixtures and power laws matching the approximately scale-invariant decay of wavelet coefficients. The former are versatile models for sparse data; in particular, they allow bounding the high-rate compression performance of a scalar mixture relative to a corresponding unmixed transform-coding system. Such a comparison is of interest for transforms with known coefficient decay but unknown coefficient ordering, e.g., when the positions of the highest-variance coefficients are unknown. The use of these models and results in distributed coding and compressed sensing scenarios is also discussed.
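For context on the binary case, the classical Shannon rate-distortion function of a Bernoulli(p) source under Hamming distortion is R(D) = h(p) − h(D) for 0 ≤ D ≤ min(p, 1−p) (and zero beyond), where h(·) is the binary entropy function. The near-linearity claim above can be probed numerically by comparing this curve with the straight line h(p)(1 − D/p); the sketch below is illustrative and not taken from the paper:

```python
import math

def h(x):
    """Binary entropy in bits, with h(0) = h(1) = 0."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion_bernoulli(p, D):
    """Shannon rate-distortion function of a Bernoulli(p) source
    under Hamming distortion: R(D) = h(p) - h(D) for D <= min(p, 1-p),
    and 0 beyond that point."""
    if D >= min(p, 1 - p):
        return 0.0
    return h(p) - h(D)

# For a sparse source (small p), compare R(D) with the chord
# h(p) * (1 - D/p) over the nontrivial distortion range.
p = 0.05
for D in [0.0, 0.01, 0.02, 0.03, 0.04]:
    r = rate_distortion_bernoulli(p, D)
    chord = h(p) * (1 - D / p)
    print(f"D={D:.2f}  R(D)={r:.4f}  linear approx={chord:.4f}")
```

Since R(D) is convex and meets the chord at both endpoints, it lies below the line h(p)(1 − D/p); rerunning the loop with smaller p shows the gap shrinking, consistent with the almost-linear behavior stated in the abstract.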
