On Generalized Information Measures and Their Applications

Publisher Summary

This chapter discusses various generalized entropies and generalized divergence measures, with a focus on generalizations of Shannon's entropy, the directed divergence, the J-divergence, and the information radius. Applications of these generalized information measures to statistical pattern recognition are also discussed; these studies are carried out on generalized information measures written in unified expressions.

Information theory studies the theoretical problems connected with the transmission of information over communication channels, including the study of uncertainty (information) measures and of practical and economical methods of coding information for transmission. A key feature of Shannon's information theory is that the term "information" can often be given a mathematical meaning as a numerically measurable quantity, on the basis of a probabilistic model, in such a way that the solutions of many important problems of information storage and transmission can be formulated in terms of this measure of the amount of information. This measure has a concrete operational interpretation: it roughly equals the minimum number of binary digits needed, on the average, to encode the message in question.
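For orientation, the classical (non-generalized) forms of the measures named above are recalled below for finite probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); this is standard textbook notation added here for context, not the unified expressions developed in the chapter itself.

\[
H(P) = -\sum_{i=1}^{n} p_i \log_2 p_i \qquad \text{(Shannon's entropy)}
\]
\[
D(P \,\|\, Q) = \sum_{i=1}^{n} p_i \log_2 \frac{p_i}{q_i} \qquad \text{(directed divergence, Kullback--Leibler)}
\]
\[
J(P, Q) = D(P \,\|\, Q) + D(Q \,\|\, P) \qquad \text{(J-divergence)}
\]
\[
R(P, Q) = \tfrac{1}{2}\, D\!\left(P \,\middle\|\, \tfrac{P+Q}{2}\right) + \tfrac{1}{2}\, D\!\left(Q \,\middle\|\, \tfrac{P+Q}{2}\right) \qquad \text{(information radius)}
\]

The operational interpretation quoted in the summary corresponds to the noiseless coding theorem: the expected length L of an optimal binary prefix code for a source with distribution P satisfies H(P) \le L < H(P) + 1.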
