On Measures of Entropy and Information

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:

- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes, the processes formed by stationary coding of memoryless sources (see the sketch after this list)
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
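To make the B-process idea concrete, here is a minimal sketch in Python. The majority-vote window rule is an illustrative assumption, not taken from the book; the point is only that a sliding-block code applies one fixed map to every shift of the input, so coding an i.i.d. (memoryless) source this way yields a stationary coded process, i.e., a B-process.

```python
import random

def sliding_block_code(x, window=3):
    """Apply one fixed rule to every length-`window` shift of x.

    Illustrative rule (an assumption, not from the book): majority vote
    over the window. Any fixed finite-window map would serve equally well.
    """
    return [
        int(2 * sum(x[i:i + window]) > window)  # 1 if ones form the majority
        for i in range(len(x) - window + 1)
    ]

if __name__ == "__main__":
    random.seed(0)
    # Memoryless (i.i.d.) binary source; its sliding-block coding is a B-process.
    source = [random.randint(0, 1) for _ in range(20)]
    print("source:", source)
    print("coded :", sliding_block_code(source))
```

Because the same rule is applied at every time index, shifting the input simply shifts the output; that stationarity is what distinguishes sliding-block codes from traditional block codes.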

[1]  R. Clausius, Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie, 1865.

[2]  K. Pearson, On the Criterion that a Given System of Deviations from the Probable in the Case of a Correlated System of Variables is Such that it Can be Reasonably Supposed to have Arisen from Random Sampling, 1900.

[3]  E. Hellinger, Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen, 1909.

[4]  L. M. M.-T., Theory of Probability, 1929, Nature.

[5]  A. Bhattacharyya, On a measure of divergence between two statistical populations defined by their probability distributions, 1943.

[6]  S. Kakutani, On Equivalence of Infinite Product Measures, 1948.

[7]  H. Chernoff, A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations, 1952.

[8]  William J. McGill, Multivariate information transmission, 1954, Trans. IRE Prof. Group Inf. Theory.

[9]  Irving John Good, Some Terminology and Notation in Information Theory, 1956.

[10]  Michael Satosi Watanabe, Information Theoretical Analysis of Multivariate Correlation, 1960, IBM J. Res. Dev.

[11]  D. Kerridge, Inaccuracy and Inference, 1961.

[12]  Johannes G. A. van Mill, et al., Transmission of Information, 1961.

[13]  G. A. Barnard, et al., Transmission of Information: A Statistical Theory of Communications, 1961.

[14]  A. Rényi, On Measures of Entropy and Information, 1961.

[15]  T. Morimoto, Markov Processes and the H-Theorem, 1963.

[16]  S. M. Ali, et al., A General Class of Coefficients of Divergence of One Distribution from Another, 1966.

[17]  Jan Havrda, et al., Quantification method of classification processes. Concept of structural a-entropy, 1967, Kybernetika.

[18]  D. A. Bell, et al., Information Theory and Reliable Communication, 1969.

[19]  Zoltán Daróczy, Generalized Information Functions, 1970, Inf. Control.

[20]  L. Boltzmann, Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen, 1970.

[21]  P. Arabie, et al., Multidimensional scaling of measures of distance between partitions, 1973.

[22]  Te Sun Han, Nonnegative Entropy Measures of Multivariate Symmetric Correlations, 1978, Inf. Control.

[23]  Te Sun Han, et al., Multiple Mutual Informations and Multiple Interactions in Frequency Data, 1980, Inf. Control.

[24]  I. Vincze, On the Concept and Measure of Information Contained in an Observation, 1981.

[25]  L. Le Cam, Asymptotic Methods in Statistical Decision Theory, 1986.

[26]  C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, 1988.

[27]  J. E. Glynn, et al., Numerical Recipes: The Art of Scientific Computing, 1989.

[28]  Kenneth Ward Church, et al., Word Association Norms, Mutual Information, and Lexicography, 1989, ACL.

[29]  J. Crutchfield, Information and Its Metric, 1990.

[30]  Gilles Brassard, et al., Experimental Quantum Cryptography, 1990, EUROCRYPT.

[31]  Thomas M. Cover, et al., Elements of Information Theory, 2005.

[32]  Jianhua Lin, Divergence measures based on the Shannon entropy, 1991, IEEE Trans. Inf. Theory.

[33]  Raymond W. Yeung, A new outlook of Shannon's information measures, 1991, IEEE Trans. Inf. Theory.

[34]  G. Parmigiani, Large Deviation Techniques in Decision, Simulation and Estimation, 1992.

[35]  R. Jozsa, Fidelity for Mixed Quantum States, 1994.

[36]  Huaiyu Zhu, On Information and Sufficiency, 1997.

[37]  C. Tsallis, et al., Information gain within nonextensive thermostatistics, 1998.

[38]  G. Maugin, Thermostatics and Thermodynamics, 1999.

[39]  Flemming Topsøe, Some inequalities for information divergence and related measures of discrimination, 2000, IEEE Trans. Inf. Theory.

[40]  Don H. Johnson, et al., Symmetrizing the Kullback-Leibler Distance, 2001.

[41]  Anand G. Dabak, et al., Relations between Kullback-Leibler distance and Fisher information, 2002.

[42]  A. J. Bell, The Co-Information Lattice, 2003.

[43]  Dominik Endres, et al., A new metric for probability distributions, 2003, IEEE Trans. Inf. Theory.

[44]  Michael J. Berry, et al., Network information and connected correlations, 2003, Physical Review Letters.

[45]  Yaneer Bar-Yam, Multiscale Complexity/Entropy, 2004, Adv. Complex Syst.

[46]  I. J. Taneja, Refinement Inequalities Among Symmetric Divergence Measures, 2005.

[47]  Robert H. Shumway, et al., Discrimination and Clustering for Multivariate Time Series, 1998.

[48]  C. E. Shannon, A Mathematical Theory of Communication, 1948.

[49]  Sung-Hyuk Cha, Comprehensive Survey on Distance/Similarity Measures between Probability Density Functions, 2007.

[50]  T. Aaron Gulliver, et al., Confliction of the Convexity and Metric Properties in f-Divergences, 2007, IEICE Trans. Fundam. Electron. Commun. Comput. Sci.

[51]  Daniel Pérez Palomar, et al., Lautum Information, 2008, IEEE Trans. Inf. Theory.

[52]  Tsachy Weissman, et al., The Information Lost in Erasures, 2008, IEEE Trans. Inf. Theory.

[53]  Mark D. Plumbley, et al., A measure of statistical complexity based on predictive information, 2010, arXiv.

[54]  Alexander N. Gorban, et al., Entropy: The Markov Ordering Approach, 2010, Entropy.

[55]  Mark D. Reid, et al., Information, Divergence and Risk for Binary Experiments, 2009, J. Mach. Learn. Res.

[56]  James P. Crutchfield, et al., Anatomy of a Bit: Information in a Time Series Observation, 2011, Chaos.

[57]  Frank Nielsen, et al., A closed-form expression for the Sharma–Mittal entropy of exponential families, 2011, arXiv.

[58]  Massimiliano Esposito, et al., Mutual entropy production in bipartite systems, 2013, arXiv:1307.4728.