To code or not to code

The theory and practice of digital communication over the past 50 years have been strongly influenced by Shannon's separation theorem. While it is conceptually and practically appealing to separate source coding from channel coding, each step in general requires infinite delay to achieve optimal performance. At the other extreme lies uncoded transmission, which incurs no delay but is suboptimal in general. In this paper, necessary and sufficient conditions for the optimality of uncoded transmission are derived. These conditions allow the construction of arbitrary examples of optimal uncoded transmission, beyond the well-known Gaussian example.
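The well-known Gaussian example mentioned above can be checked numerically: a Gaussian source sent uncoded over an AWGN channel, followed by a linear MMSE estimate at the receiver, attains the same distortion as the Shannon limit obtained by equating the rate-distortion function to the channel capacity, D = sigma_x^2 / (1 + SNR). The sketch below is illustrative (all variable names and parameter values are chosen for the demonstration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_x2 = 1.0    # source variance, also the channel power constraint P
sigma_z2 = 0.25   # channel noise variance

x = rng.normal(0.0, np.sqrt(sigma_x2), n)       # memoryless Gaussian source
y = x + rng.normal(0.0, np.sqrt(sigma_z2), n)   # one uncoded channel use per source symbol
x_hat = (sigma_x2 / (sigma_x2 + sigma_z2)) * y  # linear MMSE estimate of x from y

snr = sigma_x2 / sigma_z2
d_shannon = sigma_x2 / (1.0 + snr)        # distortion when R(D) = C (separation bound)
d_uncoded = np.mean((x - x_hat) ** 2)     # empirical distortion of uncoded transmission

print(f"Shannon bound: {d_shannon:.4f}, uncoded: {d_uncoded:.4f}")
```

With these parameters both distortions come out to about 0.2: uncoded transmission meets the separation bound with zero delay, which is exactly the kind of matched source/channel pair the paper's conditions characterize.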
