Quantifying multivariate redundancy with maximum entropy decompositions of mutual information
[1] Randall D. Beer, et al. Information Processing and Dynamics in Minimally Cognitive Agents, 2015, Cogn. Sci.
[2] P. Latham, et al. Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior, 2017, Neuron.
[3] J. Macke, et al. Neural population coding: combining insights from microscopic and mass signals, 2015, Trends in Cognitive Sciences.
[4] Luca Faes, et al. Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes, 2017, Entropy.
[5] Nicolas Brunel, et al. Sensory neural codes using multiplexed temporal scales, 2010, Trends in Neurosciences.
[6] Marian Verhelst, et al. Understanding Interdependency Through Complex Information Sharing, 2015, Entropy.
[7] Nihat Ay, et al. Robustness, canalyzing functions and systems design, 2012, Theory in Biosciences.
[8] Johannes Rauh, et al. Secret Sharing and Shared Information, 2017, Entropy.
[9] Peter M. A. Sloot, et al. Quantifying Synergistic Information Using Intermediate Stochastic Variables, 2016, Entropy.
[10] Eckehard Olbrich, et al. Quantifying unique information, 2013, Entropy.
[11] Michael J. Berry, et al. Network information and connected correlations, 2003, Physical Review Letters.
[12] E. T. Rolls, et al. Correlations and the encoding of information in the nervous system, 1999, Proceedings of the Royal Society of London. Series B: Biological Sciences.
[13] James P. Crutchfield, et al. Multivariate Dependence Beyond Shannon Information, 2016, Entropy.
[14] Larissa Albantakis, et al. From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0, 2014, PLoS Comput. Biol.
[15] Prantik Chatterjee, et al. Construction of synergy networks from gene expression data related to disease, 2016, Gene.
[16] Daniel Chicharro, et al. A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding, 2014, Neural Computation.
[17] A. J. Bell. The co-information lattice, 2003.
[18] A. Ledberg, et al. Framework to study dynamic dependencies in networks of interacting processes, 2012, Physical Review E.
[19] Robin A. A. Ince. The Partial Entropy Decomposition: Decomposing multivariate entropy and mutual information via pointwise common surprisal, 2017, arXiv.
[20] Virgil Griffith, et al. Synergy, Redundancy and Common Information, 2015, arXiv.
[21] Mikhail Prokopenko, et al. Differentiating information transfer and causal effect, 2008, arXiv:0812.4373.
[22] Eckehard Olbrich, et al. On extractable shared information, 2017, Entropy.
[23] Schreiber, et al. Measuring information transfer, 2000, Physical Review Letters.
[24] P. Latham, et al. Synergy, Redundancy, and Independence in Population Codes, Revisited, 2005, The Journal of Neuroscience.
[25] Joseph T. Lizier, et al. Towards a synergy-based approach to measuring information modification, 2013, IEEE Symposium on Artificial Life (ALife).
[27] G. Tononi, et al. Rethinking segregation and integration: contributions of whole-brain modelling, 2015, Nature Reviews Neuroscience.
[28] Joseph T. Lizier, et al. Directed Information Measures in Neuroscience, 2014.
[29] Adam B. Barrett. An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems, 2014, Physical Review E.
[30] Daniel Polani, et al. Information Theory of Decisions and Actions, 2011.
[31] Paul L. Williams. Information dynamics: Its theory and application to embodied cognitive systems, 2011.
[32] E. Davidson, et al. The evolution of hierarchical gene regulatory networks, 2009, Nature Reviews Genetics.
[33] A. Ledberg, et al. When two become one: the limits of causality analysis of brain dynamics, 2012, PLoS ONE.
[34] James P. Crutchfield, et al. Intersection Information Based on Common Randomness, 2013, Entropy.
[35] Jim Kay, et al. Partial information decomposition as a unified approach to the specification of neural goal functions, 2015, Brain and Cognition.
[36] Robin A. A. Ince. Measuring multivariate redundant information with pointwise common change in surprisal, 2016, Entropy.
[37] Bryan C. Daniels, et al. Quantifying collectivity, 2016, Current Opinion in Neurobiology.
[38] Eckehard Olbrich, et al. Shared Information: New Insights and Problems in Decomposing Information in Complex Systems, 2012, arXiv.
[39] Michael J. Berry, et al. Predictive information in a sensory population, 2013, Proceedings of the National Academy of Sciences.
[42] H. Marko. The Bidirectional Communication Theory: A Generalization of Information Theory, 1973, IEEE Transactions on Communications.
[43] Richard M. Everson, et al. Independent Components Analysis, 2000, Artificial Neural Networks in Biomedicine.
[44] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[45] Benjamin Flecker, et al. Synergy, redundancy, and multivariate information measures: an experimentalist's perspective, 2014, Journal of Computational Neuroscience.
[46] Eckehard Olbrich, et al. Information Decomposition and Synergy, 2015, Entropy.
[47] Randall D. Beer, et al. Nonnegative Decomposition of Multivariate Information, 2010, arXiv.
[48] Ralf Der, et al. Information-driven self-organization: the dynamical system approach to autonomous robot behavior, 2011, Theory in Biosciences.
[49] Randall D. Beer, et al. Generalized Measures of Information Transfer, 2011, arXiv.
[50] Michael J. Berry, et al. Synergy, Redundancy, and Independence in Population Codes, 2003, The Journal of Neuroscience.
[51] Daniel Chicharro, et al. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables, 2017, Entropy.
[52] N. Logothetis, et al. On the use of information theory for the analysis of the relationship between neural and imaging signals, 2008, Magnetic Resonance Imaging.
[53] Daniel Chicharro, et al. Algorithms of causal inference for the analysis of effective connectivity among brain regions, 2014, Front. Neuroinform.
[54] David J. Field, et al. Sparse coding with an overcomplete basis set: A strategy employed by V1?, 1997, Vision Research.
[56] William J. McGill. Multivariate information transmission, 1954, Trans. IRE Prof. Group Inf. Theory.
[57] Tian Zheng, et al. Inference of Regulatory Gene Interactions from Expression Data Using Three-Way Mutual Information, 2009, Annals of the New York Academy of Sciences.
[58] Luca Faes, et al. An Information-Theoretic Framework to Map the Spatiotemporal Dynamics of the Scalp Electroencephalogram, 2016, IEEE Transactions on Biomedical Engineering.
[59] H. Barlow. Redundancy reduction revisited, 2001, Network.
[61] Nihat Ay, et al. Information Geometry on Complexity and Stochastic Interaction, 2015, Entropy.
[62] Sami El Boustani, et al. Prediction of spatiotemporal patterns of neural activity from pairwise correlations, 2009, Physical Review Letters.
[63] Shun-ichi Amari. Information geometry on hierarchy of probability distributions, 2001, IEEE Trans. Inf. Theory.
[64] Daniel Polani, et al. Information Flows in Causal Networks, 2008, Adv. Complex Syst.
[65] Nihat Ay, et al. Hierarchical Quantification of Synergy in Channels, 2016, Front. Robot. AI.
[66] Christof Koch, et al. Quantifying synergistic mutual information, 2012, arXiv.
[67] Jochen Triesch, et al. Learning independent causes in natural images explains the space-variant oblique effect, 2009, IEEE 8th International Conference on Development and Learning.
[68] Eckehard Olbrich, et al. Reconsidering unique information: Towards a multivariate information decomposition, 2014, IEEE International Symposium on Information Theory.
[69] Luca Faes, et al. Estimating the decomposition of predictive information in multivariate systems, 2015, Physical Review E.
[70] Naftali Tishby, et al. The information bottleneck method, 2000, arXiv.
[71] David R. Anderson, et al. Model selection and multimodel inference: a practical information-theoretic approach, 2003.
[72] Christoph Salge, et al. A Bivariate Measure of Redundant Information, 2012, Physical Review E.
[73] Stefano Panzeri, et al. Information-theoretic methods for studying population codes, 2010, Neural Networks.
[74] Maxym Myroshnychenko, et al. High-Degree Neurons Feed Cortical Computations, 2016, PLoS Comput. Biol.
[75] Shun-ichi Amari, et al. Information Geometry and Its Applications, 2016.
[76] Karl J. Friston, et al. Effective connectivity: Influence, causality and biophysical modeling, 2011, NeuroImage.
[77] Daniel Chicharro, et al. Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss, 2016, Entropy.
[78] A. Pouget, et al. Neural correlations, population coding and computation, 2006, Nature Reviews Neuroscience.