On Privacy Amplification, Lossy Compression, and Their Duality to Channel Coding

We examine the task of privacy amplification from information-theoretic and coding-theoretic points of view. In the former, we give a one-shot characterization of the optimal rate of privacy amplification against classical adversaries in terms of the optimal type-II error in asymmetric hypothesis testing. The converse significantly improves on previous bounds based on smooth min-entropy by Watanabe and Hayashi [7] and turns out to be equivalent to a recent formulation in terms of the $E_\gamma$ divergence by Yang et al. [9]. In the latter, we show that protocols for privacy amplification based on linear codes can be easily repurposed for channel simulation. Combined with the known relations between channel simulation and lossy source coding, this implies that privacy amplification can be understood as a basic primitive for both channel simulation and lossy compression. Applied to symmetric channels or lossy compression settings, our construction leads to protocols of optimal rate in the asymptotic i.i.d. limit. Finally, appealing to the notion of channel duality recently detailed by us in [15], we show that linear error-correcting codes for symmetric channels with quantum output can be transformed into linear lossy source coding schemes for classical variables arising from the dual channel. This explains a "curious duality" in these problems for the (self-dual) erasure channel observed by Martinian and Yedidia [16] and partly anticipates recent results on optimal lossy compression by polar and low-density generator matrix codes.
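For reference, the two one-shot quantities named above have standard definitions, recalled here only to fix notation; the precise form of the characterization is the subject of the paper itself. For distributions $P$ and $Q$ on a common alphabet, the optimal type-II error of an asymmetric hypothesis test with type-I error at most $\epsilon$, and the $E_\gamma$ divergence for $\gamma \geq 1$, are
\[
\beta_\epsilon(P,Q) \;=\; \min_{T \,:\, P[T] \,\geq\, 1-\epsilon} Q[T],
\qquad
E_\gamma(P\|Q) \;=\; \max_{T}\big( P[T] - \gamma\, Q[T] \big) \;=\; \sum_x \big[ P(x) - \gamma\, Q(x) \big]^+ ,
\]
where the optimizations run over events (tests) $T$ and $[t]^+ = \max\{t,0\}$. The first quantity is the smallest probability of accepting $P$ when $Q$ is true, among all tests that accept $P$ with probability at least $1-\epsilon$ when $P$ is true; the second interpolates between total variation distance ($\gamma = 1$) and a relaxation thereof.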

[1] R. Renner et al., "Generalized Entropies," 2012, arXiv:1211.3141.

[2] Joseph M. Renes et al., "Alignment of Polarized Sets," 2014, IEEE Journal on Selected Areas in Communications.

[3] Leonid A. Levin et al., "Pseudo-random generation from one-way functions," 1989, STOC '89.

[4] H. Nagaoka et al., "Strong converse theorems in the quantum information theory," 1999, 1999 Information Theory and Networking Workshop (Cat. No.99EX371).

[5] Renato Renner et al., "Simple and Tight Bounds for Information Reconciliation and Privacy Amplification," 2005, ASIACRYPT.

[6] Larry Carter et al., "Universal Classes of Hash Functions," 1979, J. Comput. Syst. Sci.

[7] Saikat Guha et al., "Polar Codes for Classical-Quantum Channels," 2011, IEEE Transactions on Information Theory.

[8] Igor Devetak et al., "Channel Simulation With Quantum Side Information," 2009, IEEE Transactions on Information Theory.

[9] Masahito Hayashi et al., "Non-asymptotic analysis of privacy amplification via Rényi entropy and inf-spectral entropy," 2012, 2013 IEEE International Symposium on Information Theory.

[10] Thomas M. Cover et al., "Elements of Information Theory," 2nd ed., 2006.

[11] Mark M. Wilde et al., "Quantum Rate Distortion, Reverse Shannon Theorems, and Source-Channel Separation," 2011, IEEE Transactions on Information Theory.

[12] Masahito Hayashi et al., "A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks," 2012, IEEE Transactions on Information Theory.

[13] E. S. Pearson et al., "On the Problem of the Most Efficient Tests of Statistical Hypotheses," 1933.

[14] Sergio Verdú et al., "Simulation of random processes and rate-distortion theory," 1996, IEEE Trans. Inf. Theory.

[15] Masahito Hayashi et al., "Uniform Random Number Generation From Markov Chains: Non-Asymptotic and Asymptotic Analyses," 2015, IEEE Transactions on Information Theory.

[16] Paul W. Cuff et al., "Distributed Channel Synthesis," 2012, IEEE Transactions on Information Theory.

[17] Nilanjana Datta et al., "One-Shot Entanglement-Assisted Quantum and Classical Communication," 2011, IEEE Transactions on Information Theory.

[18] H. Vincent Poor et al., "Channel coding: non-asymptotic fundamental limits," 2010.

[19] Peter W. Shor et al., "Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem," 2001, IEEE Trans. Inf. Theory.

[20] H. Chernoff, "Large-Sample Theory: Parametric Case," 1956.

[21] Rudolf Ahlswede et al., "Common randomness in information theory and cryptography - I: Secret sharing," 1993, IEEE Trans. Inf. Theory.

[22] Joseph M. Renes et al., "Duality of privacy amplification against quantum adversaries and data compression with quantum side information," 2010, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences.

[23] R. Schumann, "Quantum Information Theory," 2000, arXiv:quant-ph/0010060.

[24] Emin Martinian et al., "Iterative Quantization Using Codes On Graphs," 2004, arXiv.

[25] Sergio Verdú et al., "$E_\gamma$-Resolvability," 2015, IEEE Transactions on Information Theory.

[26] David J. C. MacKay et al., "Good Error-Correcting Codes Based on Very Sparse Matrices," 1997, IEEE Trans. Inf. Theory.

[27] H. Vincent Poor et al., "Channel Coding Rate in the Finite Blocklength Regime," 2010, IEEE Transactions on Information Theory.

[28] G. Barnard, "The Theory of Information," 1951.

[29] Nicolas Macris et al., "Approaching the rate-distortion limit by spatial coupling with belief propagation and decimation," 2013, 2013 IEEE International Symposium on Information Theory.

[30] A. Winter, "Compression of sources of probability distributions and density operators," 2002, arXiv:quant-ph/0208131.

[31] Rüdiger L. Urbanke et al., "Polar Codes are Optimal for Lossy Source Coding," 2009, IEEE Transactions on Information Theory.

[32] Thomas M. Cover et al., "Elements of Information Theory," 2005.

[33] Joseph M. Renes et al., "One-Shot Lossy Quantum Data Compression," 2013, IEEE Transactions on Information Theory.

[34] Masahito Hayashi et al., "Tight Exponential Analysis of Universally Composable Privacy Amplification and Its Applications," 2010, IEEE Transactions on Information Theory.

[35] Robert J. Vanderbei et al., "Linear Programming: Foundations and Extensions," 1998, Kluwer International Series in Operations Research and Management Science.

[36] H. Vincent Poor et al., "Wiretap Channels: Nonasymptotic Fundamental Limits," 2017, IEEE Transactions on Information Theory.

[37] M. Tomamichel, "A framework for non-asymptotic quantum information theory," 2012, arXiv:1203.2142.

[38] Rudolf Ahlswede et al., "Common Randomness in Information Theory and Cryptography - Part II: CR Capacity," 1998, IEEE Trans. Inf. Theory.

[39] Joseph M. Renes et al., "Noisy Channel Coding via Privacy Amplification and Information Reconciliation," 2010, IEEE Transactions on Information Theory.

[40] Noam Nisan et al., "Randomness is Linear in Space," 1996, J. Comput. Syst. Sci.

[41] Masahito Hayashi et al., "General nonasymptotic and asymptotic formulas in channel resolvability and identification capacity and their application to the wiretap channel," 2006, IEEE Transactions on Information Theory.

[42] Amin Gohari et al., "Non-asymptotic output statistics of Random Binning and its applications," 2013, 2013 IEEE International Symposium on Information Theory.

[43] Sergio Verdú et al., "Fixed-Length Lossy Compression in the Finite Blocklength Regime," 2011, IEEE Transactions on Information Theory.

[44] Ueli Maurer et al., "Generalized privacy amplification," 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.

[45] Gilles Brassard et al., "Privacy Amplification by Public Discussion," 1988, SIAM J. Comput.

[46] Nicolas Macris et al., "Approaching the Rate-Distortion Limit With Spatial Coupling, Belief Propagation, and Decimation," 2013, IEEE Transactions on Information Theory.

[47] Edgar Reich, "The Theory of Information," 1950.