Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression
[1] Sergio Verdú, et al. On the Interplay Between Conditional Entropy and Error Probability, 2010, IEEE Transactions on Information Theory.
[2] Klaudia Frankfurter. Computers and Intractability: A Guide to the Theory of NP-Completeness, 2016.
[3] Serdar Boztas, et al. Comments on "An inequality on guessing and its application to sequential decoding", 1997, IEEE Transactions on Information Theory.
[4] Muriel Médard, et al. Guessing with limited memory, 2017, IEEE International Symposium on Information Theory (ISIT).
[5] Neri Merhav, et al. Guessing Subject to Distortion, 1998, IEEE Transactions on Information Theory.
[6] Raymond W. Yeung, et al. The Interplay Between Entropy and Variational Distance, 2007, IEEE Transactions on Information Theory.
[7] L. Campbell, et al. Definition of entropy by means of a coding problem, 1966.
[8] G. Crooks. On Measures of Entropy and Information, 2015.
[9] Sergio Verdú, et al. Rejection Sampling and Noncausal Sampling Under Moment Constraints, 2018, IEEE International Symposium on Information Theory (ISIT).
[10] Sergio Verdú, et al. Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics, 2014, IEEE Transactions on Information Theory.
[11] Luisa Gargano, et al. Information theoretic measures of distances and their econometric applications, 2013, IEEE International Symposium on Information Theory.
[12] Imre Csiszár. Generalized cutoff rates and Rényi's information measures, 1995, IEEE Transactions on Information Theory.
[13] Mark M. Wilde, et al. Strong converse theorems using Rényi entropies, 2016, IEEE International Symposium on Information Theory (ISIT).
[14] Sergio Verdú, et al. Variable-length lossy compression and channel coding: Non-asymptotic converses via cumulant generating functions, 2014, IEEE International Symposium on Information Theory.
[15] Lukasz Rudnicki, et al. Majorization entropic uncertainty relations, 2013, arXiv.
[16] Venkat Anantharam, et al. Optimal sequences and sum capacity of synchronous CDMA systems, 1999, IEEE Transactions on Information Theory.
[17] Suguru Arimoto, et al. On the converse to the coding theorem for discrete memoryless channels (Corresp.), 1973, IEEE Transactions on Information Theory.
[18] Igal Sason. On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem, 2016, IEEE Transactions on Information Theory.
[19] Masahito Hayashi, et al. Operational interpretation of Rényi conditional mutual information via composite hypothesis testing against Markov distributions, 2016, IEEE International Symposium on Information Theory (ISIT).
[20] Robert J. McEliece, et al. An inequality on entropy, 1995, IEEE International Symposium on Information Theory.
[21] Suhas N. Diggavi, et al. The effect of bias on the guesswork of hash functions, 2017, IEEE International Symposium on Information Theory (ISIT).
[22] Aydin Sezgin, et al. Applications of Majorization Theory in Space-Time Cooperative Communications, 2010.
[23] Milán Mosonyi, et al. Quantum Hypothesis Testing and the Operational Interpretation of the Quantum Rényi Relative Entropies, 2013, arXiv.
[24] Neri Merhav, et al. Joint Source-Channel Coding and Guessing with Application to Sequential Decoding, 1998, IEEE Transactions on Information Theory.
[25] Guojun Gan, et al. Data Clustering: Theory, Algorithms, and Applications (ASA-SIAM Series on Statistics and Applied Probability), 2007.
[26] Rajesh Sundaresan, et al. The Shannon Cipher System With a Guessing Wiretapper: General Sources, 2011, IEEE Transactions on Information Theory.
[27] Vincent Y. F. Tan, et al. Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies, 2016, IEEE Transactions on Information Theory.
[28] Harry Joe, et al. Majorization, entropy and paired comparisons, 1988.
[29] Rajesh Sundaresan. Guessing based on length functions, Technical Report TR-PME-2007-02, DRDO–IISc Programme on Advanced Research in Mathematical Engineering, 2007.
[30] Sergio Verdú, et al. Improved Bounds on Lossless Source Coding and Guessing Moments via Rényi Measures, 2018, IEEE Transactions on Information Theory.
[31] Ofer Shayevitz, et al. Reducing Guesswork via an Unreliable Oracle, 2018, IEEE Transactions on Information Theory.
[32] Sergio Verdú, et al. Convexity/concavity of Rényi entropy and α-mutual information, 2015, IEEE International Symposium on Information Theory (ISIT).
[33] Marco Dalai, et al. Lower Bounds on the Probability of Error for Classical and Classical-Quantum Channels, 2012, IEEE Transactions on Information Theory.
[34] B. Arnold, et al. Majorization and the Lorenz Order with Applications in Applied Mathematics and Economics, 2018.
[35] Frederick Jelinek, et al. On variable-length-to-block coding, 1972, IEEE Transactions on Information Theory.
[36] Ofer Shayevitz, et al. On Rényi measures and hypothesis testing, 2011, IEEE International Symposium on Information Theory.
[37] Holger Boche, et al. Majorization and Matrix-Monotone Functions in Wireless Communications, 2007, Foundations and Trends in Communications and Information Theory.
[38] David S. Johnson, et al. Computers and Intractability: A Guide to the Theory of NP-Completeness, 1978.
[39] J. Massey. Guessing and entropy, 1994, IEEE International Symposium on Information Theory.
[40] Thomas M. Cover, et al. Elements of Information Theory, 2nd ed., 2006.
[41] Hazer Inaltekin, et al. Optimality of Binary Power Control for the Single Cell Uplink, 2012, IEEE Transactions on Information Theory.
[42] B. Arnold. Majorization: Here, There and Everywhere, 2007, arXiv:0801.4221.
[43] Amos Lapidoth, et al. Distributed task encoding, 2017, IEEE International Symposium on Information Theory (ISIT).
[44] Ugo Vaccaro, et al. Bounding the average length of optimal source codes via majorization theory, 2004, IEEE Transactions on Information Theory.
[45] Vincent Y. F. Tan, et al. Rényi Resolvability and Its Applications to the Wiretap Channel, 2019, IEEE Transactions on Information Theory.
[46] Hans S. Witsenhausen. Some aspects of convexity useful in information theory, 1980, IEEE Transactions on Information Theory.
[47] Sergio Verdú, et al. Cumulant generating function of codeword lengths in optimal lossless compression, 2014, IEEE International Symposium on Information Theory.
[48] Harry Joe, et al. Majorization and divergence, 1990.
[49] Thomas M. Cover, et al. Elements of Information Theory, 2nd ed., 2005.
[50] Ken R. Duffy, et al. Guesswork, Large Deviations, and Shannon Entropy, 2012, IEEE Transactions on Information Theory.
[51] Slavko Simić. Jensen's inequality and new entropy bounds, 2009, Applied Mathematics Letters.
[52] Alfredo De Santis, et al. Bounds on entropy in a guessing game, 2001, IEEE Transactions on Information Theory.
[53] Venkat Anantharam, et al. Optimal sequences for CDMA under colored noise: A Schur-saddle function property, 2002, IEEE Transactions on Information Theory.
[54] Hiroki Koga. Characterization of the Smooth Rényi Entropy Using Majorization, 2013, IEEE Information Theory Workshop (ITW).
[55] Yi Jiang, et al. MIMO Transceiver Design via Majorization Theory, 2007, Foundations and Trends in Communications and Information Theory.
[56] Amos Lapidoth, et al. Encoding Tasks and Rényi Entropy, 2014, IEEE Transactions on Information Theory.
[57] C. E. Pfister, et al. Rényi entropy, guesswork moments, and large deviations, 2004, IEEE Transactions on Information Theory.
[58] Peter Harremoës, et al. Rényi Divergence and Kullback-Leibler Divergence, 2012, IEEE Transactions on Information Theory.
[59] L. L. Campbell. A Coding Theorem and Rényi's Entropy, 1965, Information and Control.
[60] Ugo Vaccaro, et al. Maximum Entropy Interval Aggregations, 2018, IEEE International Symposium on Information Theory (ISIT).
[61] Amos Lapidoth, et al. Guessing Attacks on Distributed-Storage Systems, 2015, IEEE International Symposium on Information Theory (ISIT).
[62] John Thompson, et al. Cooperative communications for improved wireless network transmission: Framework for virtual antenna array applications, 2009.
[63] Vincent Y. F. Tan, et al. Rényi Resolvability and Its Applications to the Wiretap Channel, 2017, IEEE Transactions on Information Theory.
[64] Jaikumar Radhakrishnan, et al. The Communication Complexity of Correlation, 2007, Twenty-Second Annual IEEE Conference on Computational Complexity (CCC'07).
[65] C. E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.
[66] Rajesh Sundaresan, et al. Guessing Revisited: A Large Deviations Approach, 2010, IEEE Transactions on Information Theory.
[67] J. Steele. The Cauchy–Schwarz Master Class, 2004.
[68] Shigeaki Kuzuoka. On the smooth Rényi entropy and variable-length source coding allowing errors, 2016, IEEE International Symposium on Information Theory (ISIT).
[69] Vincent Y. F. Tan, et al. Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures, 2017, IEEE Transactions on Information Theory.
[70] Neri Merhav, et al. The Shannon cipher system with a guessing wiretapper, 1999, IEEE Transactions on Information Theory.
[71] Muriel Médard, et al. Centralized vs. decentralized multi-agent guesswork, 2017, IEEE International Symposium on Information Theory (ISIT).
[72] Erdal Arikan. An inequality on guessing and its application to sequential decoding, 1996, IEEE Transactions on Information Theory.
[73] Shigeaki Kuzuoka, et al. On the Conditional Smooth Rényi Entropy and Its Applications in Guessing and Source Coding, 2018, IEEE Transactions on Information Theory.
[74] Rajesh Sundaresan, et al. Guessing Under Source Uncertainty, 2006, IEEE Transactions on Information Theory.
[75] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[76] S. Verdú, et al. Arimoto channel coding converse and Rényi divergence, 2010, 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[77] Vincent Y. F. Tan, et al. Wyner's Common Information Under Rényi Divergence Measures, 2017, IEEE Transactions on Information Theory.
[78] Shuhong Wang, et al. Schur-Convexity on Generalized Information Entropy and Its Applications, 2011, ICICA.
[79] Sergio Verdú, et al. Arimoto–Rényi Conditional Entropy and Bayesian M-Ary Hypothesis Testing, 2017, IEEE Transactions on Information Theory.
[80] Charles R. Johnson, et al. Matrix Analysis, 2nd ed., 2012.
[81] Himanshu Tyagi, et al. Coding theorems using Rényi information measures, 2017, Twenty-third National Conference on Communications (NCC).
[82] David Tse, et al. Optimal sequences, power control, and user capacity of synchronous CDMA systems with linear MMSE multiuser receivers, 1999, IEEE Transactions on Information Theory.
[83] M. Ben-Bassat, et al. Rényi's entropy and the probability of error, 1978, IEEE Transactions on Information Theory.
[84] Ugo Vaccaro, et al. Bounds on the Entropy of a Function of a Random Variable and Their Applications, 2017, IEEE Transactions on Information Theory.
[85] I. Csiszár. Generalized Cutoff Rates and Rényi's Information Measures, 1993, IEEE International Symposium on Information Theory.