The Classical Capacity of Quantum Gaussian Gauge-Covariant Channels: Beyond i.i.d.

Following ISIT 2016, the workshop “Beyond i.i.d. in Information Theory” was held in Barcelona. This note sketches the author’s contribution, which can be considered an extension of the Shannon Lecture delivered at ISIT 2016. A coding theorem for the classical capacity of a broadband gauge-covariant Gaussian channel with stationary quantum Gaussian noise is formulated and discussed.
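For orientation, the flavor of capacity formula at stake can be illustrated by the standard single-mode gauge-covariant (thermal attenuator) case; this example is not taken from the note itself, which treats the broadband setting with stationary noise, but is a well-established special case in the literature:

```latex
% Entropy of a thermal state with mean photon number x:
g(x) = (x+1)\log(x+1) - x\log x .
% Classical capacity of the gauge-covariant thermal attenuator with
% transmissivity \eta, input energy constraint N, noise photon number N_c:
C = g\!\left(\eta N + (1-\eta)N_c\right) - g\!\left((1-\eta)N_c\right).
```

In the broadband case the analogous expression is optimized over the input power spectral density subject to a total-power constraint, a quantum analogue of water-filling.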
