[1] Benjamin Flecker, et al. Synergy, redundancy, and multivariate information measures: an experimentalist's perspective, 2014, Journal of Computational Neuroscience.
[2] Yu. V. Linnik, et al. Decomposition of Random Variables and Vectors, 1977.
[3] K. Phoon, et al. Simulation of strongly non-Gaussian processes using Karhunen–Loève expansion, 2005.
[4] Joseph T. Lizier, et al. Towards a synergy-based approach to measuring information modification, 2013, 2013 IEEE Symposium on Artificial Life (ALife).
[5] Aaron D. Wyner, et al. The common information of two dependent random variables, 1975, IEEE Trans. Inf. Theory.
[6] Sang Joon Kim, et al. A Mathematical Theory of Communication, 2006.
[7] Michael J. Berry, et al. Network information and connected correlations, 2003, Physical Review Letters.
[8] Marek Zukowski, et al. The Essence of Entanglement, 2001, Fundamental Theories of Physics.
[9] K. Karhunen. Zur Spektraltheorie stochastischer Prozesse, 1946.
[10] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[11] Viola Priesemann, et al. Bits from Brains for Biologically Inspired Computing, 2014, Front. Robot. AI.
[12] James P. Crutchfield, et al. Intersection Information Based on Common Randomness, 2013, Entropy.
[13] R. Ghanem, et al. Stochastic Finite Elements: A Spectral Approach, 1990.
[14] Eckehard Olbrich, et al. Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems, 2012, ArXiv.
[15] Shun-ichi Amari, et al. Information geometry on hierarchy of probability distributions, 2001, IEEE Trans. Inf. Theory.
[16] Randall D. Beer, et al. Nonnegative Decomposition of Multivariate Information, 2010, ArXiv.
[17] Christof Koch, et al. Quantifying synergistic mutual information, 2012, ArXiv.
[18] K. Krippendorff. Mathematical Theory of Communication, 2009.
[19] Eckehard Olbrich, et al. Quantifying unique information, 2013, Entropy.
[20] Claude E. Shannon, et al. The Mathematical Theory of Communication, 1950.
[21] Eckehard Olbrich, et al. Information Decomposition and Synergy, 2015, Entropy.
[22] H. Witsenhausen. Values and Bounds for the Common Information of Two Discrete Random Variables, 1976.
[23] Wei Liu, et al. Wyner's common information for continuous random variables: a lossy source coding interpretation, 2011, 2011 45th Annual Conference on Information Sciences and Systems.
[24] Michael J. Berry, et al. Synergy, Redundancy, and Independence in Population Codes, 2003, The Journal of Neuroscience.
[25] Michael Satosi Watanabe. Information Theoretical Analysis of Multivariate Correlation, 1960, IBM J. Res. Dev.
[26] P. Latham, et al. Synergy, Redundancy, and Independence in Population Codes, Revisited, 2005, The Journal of Neuroscience.