Generalization of the de Bruijn Identity to General $\phi$-Entropies and $\phi$-Fisher Informations

In this paper, we propose generalizations of the de Bruijn identity based on extensions of the Shannon entropy, the Fisher information, and their associated divergences or relative measures. The foundations of these generalizations are the $\phi$-entropies and divergences of the Csiszár (or Salicrú) class, considered in a multidimensional context (including the one-dimensional case) and for several types of noisy channels characterized by probability distributions more general than the well-known Gaussian noise. We find that the gradient and/or the Hessian of these entropies or divergences with respect to the noise parameter naturally give rise to generalized versions of the Fisher information or divergence, which we name the $\phi$-Fisher information (divergence). The resulting identities can be viewed as further extensions of the classical de Bruijn identity. Analogously, we show that a similar relation holds between the $\phi$-divergence and an extended mean-square error, named the $\phi$-mean-square error, for the Gaussian channel.
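For orientation, recall the two classical ingredients being generalized: the de Bruijn identity for the Gaussian channel and the Csiszár $\phi$-divergence. A minimal sketch follows, written in the standard textbook forms; the exact normalizations, and the precise $\phi$-entropy and $\phi$-Fisher definitions adopted in the paper, may differ. If $Y_t = X + \sqrt{t}\,Z$ with $Z \sim \mathcal{N}(0, I)$ independent of $X$, then

$$
\frac{\mathrm{d}}{\mathrm{d}t}\, h(Y_t) \;=\; \frac{1}{2}\, J(Y_t),
\qquad
h(Y) = -\int p_Y \log p_Y, \qquad
J(Y) = \int \frac{\|\nabla p_Y\|^2}{p_Y},
$$

where $h$ is the differential entropy and $J$ the Fisher information. The Csiszár $\phi$-divergence between densities $p$ and $q$, for convex $\phi$ with $\phi(1) = 0$, reads

$$
D_\phi(p \,\|\, q) \;=\; \int q(x)\, \phi\!\left(\frac{p(x)}{q(x)}\right) \mathrm{d}x,
$$

which recovers the Kullback-Leibler divergence for $\phi(u) = u \log u$. The classical counterpart of the final relation mentioned above is the I-MMSE identity of Guo, Shamai, and Verdú, $\frac{\mathrm{d}}{\mathrm{d}\gamma} I(X; \sqrt{\gamma}\, X + Z) = \frac{1}{2}\, \mathrm{mmse}(\gamma)$, linking mutual information to the minimum mean-square error at signal-to-noise ratio $\gamma$.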
