Proceedings of the Eighth Workshop on Information Theoretic Methods in Science and Engineering
[1] Jae Oh Woo, et al. A lower bound on the Rényi entropy of convolutions in the integers, 2014, 2014 IEEE International Symposium on Information Theory.
[2] H. Marko, et al. The Bidirectional Communication Theory - A Generalization of Information Theory, 1973, IEEE Transactions on Communications.
[3] Haim H. Permuter, et al. Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing, 2009, IEEE Transactions on Information Theory.
[4] Vadim A. Kaimanovich, et al. Random Walks on Discrete Groups: Boundary and Entropy, 1983.
[5] Liyao Wang, et al. Optimal Concentration of Information Content For Log-Concave Densities, 2015, ArXiv.
[6] E. Lehmann. Testing Statistical Hypotheses, 1960.
[7] V. Milman, et al. Geometry of Log-concave Functions and Measures, 2005.
[8] Wojciech Szpankowski, et al. Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates, 2007, EURASIP J. Bioinform. Syst. Biol.
[9] Mokshay M. Madiman, et al. Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information, 2012, IEEE Transactions on Information Theory.
[10] Hans S. Witsenhausen, et al. A conditional entropy bound for a pair of discrete random variables, 1975, IEEE Trans. Inf. Theory.
[11] Ioannis Kontoyiannis, et al. Estimating the Directed Information and Testing for Causality, 2015, IEEE Transactions on Information Theory.
[12] Tsachy Weissman, et al. Rate-distortion in near-linear time, 2008, 2008 IEEE International Symposium on Information Theory.
[13] Erwin Lutwak, et al. Moment-entropy inequalities, 2004.
[14] C. Granger. Investigating causal relations by econometric models and cross-spectral methods, 1969.
[15] Haim H. Permuter, et al. Universal Estimation of Directed Information, 2010, IEEE Transactions on Information Theory.
[16] K. Ball. Logarithmically concave functions and sections of convex sets in $R^{n}$, 1988.
[17] S. S. Wilks. The Large-Sample Distribution of the Likelihood Ratio for Testing Composite Hypotheses, 1938.
[18] Sergey G. Bobkov, et al. Dimensional behaviour of entropy and information, 2011, ArXiv.
[19] Zaher Dawy, et al. Genomic analysis using methods from information theory, 2004, Information Theory Workshop.
[20] Alfred O. Hero, et al. Using Directed Information to Build Biologically Relevant Influence Networks, 2007, J. Bioinform. Comput. Biol.
[21] Mokshay M. Madiman, et al. Entropies of Weighted Sums in Cyclic Groups and an Application to Polar Codes, 2017, Entropy.
[22] A. Lapidoth, et al. On the entropy of the sum and of the difference of independent random variables, 2008, 2008 IEEE 25th Convention of Electrical and Electronics Engineers in Israel.
[23] Donald L. Iglehart, et al. Importance sampling for stochastic simulations, 1989.
[24] C. Borell. Convex measures on locally convex spaces, 1974.
[25] Olivier J. J. Michel, et al. The relation between Granger causality and directed information theory: a review, 2012, Entropy.
[26] Gerhard Kramer, et al. Directed information for channels with feedback, 1998.
[27] Todd P. Coleman, et al. Estimating the directed information to infer causal relationships in ensemble neural spike train recordings, 2010, Journal of Computational Neuroscience.