Shannon Shakes Hands with Chernoff: Big Data Viewpoint On Channel Information Measures

Shannon entropy is the foundation of information theory and has proven effective in many fields, most notably communications. Rényi entropy and Chernoff information are two other popular information measures with wide applications. Mutual information is well suited to measuring channel information because it captures the relation between the output and input variables. In this paper, we reexamine these channel information measures from a big data viewpoint by means of the ACE (alternating conditional expectations) algorithm. Simulation results show that the decompositions of Shannon and Chernoff mutual information with respect to the channel parameters are almost identical. In this sense, Shannon shakes hands with Chernoff: they are different measures of the same information quantity. We also conjecture that there is an intrinsic nature of channel information that is determined solely by the channel parameters.
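As a concrete illustration (not taken from the paper itself), consider a binary symmetric channel, whose single parameter is the crossover probability p. The minimal Python sketch below, under that assumed setup, computes the Shannon mutual information and the Chernoff information as functions of p, and then runs a toy single-predictor ACE iteration, with a crude binned smoother standing in for the supersmoother of Breiman and Friedman's original algorithm, to regress a noisy information measure on the channel parameter. All function names are ours, chosen for illustration only.

import numpy as np
from scipy.optimize import minimize_scalar

def shannon_mi_bsc(p):
    """Shannon mutual information (bits) of a binary symmetric channel
    with crossover probability p and a uniform input: I = 1 - H2(p)."""
    h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return 1.0 - h2

def chernoff_information(P0, P1):
    """Chernoff information (bits) between two discrete distributions:
    C = -min_{0 < lam < 1} log2 sum_y P0(y)^lam * P1(y)^(1 - lam)."""
    def neg_exponent(lam):
        return np.log2(np.sum(P0 ** lam * P1 ** (1 - lam)))
    res = minimize_scalar(neg_exponent, bounds=(1e-9, 1 - 1e-9), method="bounded")
    return -res.fun

def cond_mean(x, z, bins=20):
    """Binned-average estimate of E[z | x]; a crude stand-in for the
    data smoother used in the original ACE algorithm."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    out = np.zeros_like(z, dtype=float)
    for b in range(bins):
        mask = idx == b
        if mask.any():
            out[mask] = z[mask].mean()
    return out

def ace(x, y, iters=30):
    """Minimal single-predictor ACE iteration: alternate
    phi(x) <- E[theta(y) | x] and theta(y) <- E[phi(x) | y],
    renormalising theta to unit variance on each pass."""
    theta = (y - y.mean()) / y.std()
    phi = np.zeros_like(theta)
    for _ in range(iters):
        phi = cond_mean(x, theta)
        theta = cond_mean(y, phi)
        theta = (theta - theta.mean()) / theta.std()
    return phi, theta

# Both measures as functions of the single BSC parameter p.
for p in (0.05, 0.1, 0.2, 0.3, 0.4):
    P0, P1 = np.array([1 - p, p]), np.array([p, 1 - p])
    print(f"p={p:.2f}  I(X;Y)={shannon_mi_bsc(p):.4f}  "
          f"C(P0,P1)={chernoff_information(P0, P1):.4f}")

# Toy ACE run: regress a noisy Shannon mutual information on the
# channel parameter and report the maximal correlation ACE finds.
rng = np.random.default_rng(0)
p_samples = rng.uniform(0.01, 0.49, 2000)
mi_noisy = shannon_mi_bsc(p_samples) + 0.01 * rng.normal(size=p_samples.size)
phi, theta = ace(p_samples, mi_noisy)
print("maximal correlation:", np.corrcoef(phi, theta)[0, 1])

The toy run only shows the shape of the machinery involved: both measures here are deterministic functions of the same channel parameter, which is precisely the situation in which their decompositions with respect to that parameter can be compared.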
