Measuring Integrated Information: Comparison of Candidate Measures in Theory and Simulation
[1] C. Granger. Investigating Causal Relations by Econometric Models and Cross-Spectral Methods, 1969.
[2] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[3] Shlomo Shamai, et al. On information rates for mismatched decoders, 1994, IEEE Transactions on Information Theory.
[4] G. Edelman, et al. A measure for brain complexity: relating functional segregation and integration in the nervous system, 1994, Proceedings of the National Academy of Sciences of the United States of America.
[5] Shun-ichi Amari, et al. Methods of Information Geometry, 2000.
[6] Peter Dayan, et al. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems, 2001.
[8] Olaf Sporns, et al. Measuring information integration, 2003, BMC Neuroscience.
[9] Michael J. Berry, et al. Synergy, Redundancy, and Independence in Population Codes, 2003, The Journal of Neuroscience.
[10] A. Kraskov, et al. Estimating mutual information, 2003, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics.
[11] A. Seth. Causal connectivity of evolved neural networks during behavior, 2005, Network: Computation in Neural Systems.
[12] P. Latham, et al. Synergy, Redundancy, and Independence in Population Codes, Revisited, 2005, The Journal of Neuroscience.
[13] J. Urry. Complexity, 2006.
[14] Stephen P. Boyd, et al. Convex Optimization, 2004.
[15] Anil K. Seth, et al. Theories and measures of consciousness: an extended framework, 2006, Proceedings of the National Academy of Sciences of the United States of America.
[16] Helmut Lütkepohl, et al. New Introduction to Multiple Time Series Analysis, 2007.
[17] Giulio Tononi, et al. Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework, 2008, PLoS Computational Biology.
[18] K. Gurney, et al. Network ‘Small-World-Ness’: A Quantitative Method for Determining Canonical Network Equivalence, 2008, PLoS ONE.
[19] Qing Wang, et al. Divergence Estimation for Multidimensional Densities via k-Nearest-Neighbor Distances, 2009, IEEE Transactions on Information Theory.
[20] A. Seth, et al. Granger causality and transfer entropy are equivalent for Gaussian variables, 2009, Physical Review Letters.
[21] S. Amari. Information geometry in optimization, machine learning and statistical inference, 2010.
[22] Randall D. Beer, et al. Nonnegative Decomposition of Multivariate Information, 2010, arXiv.
[23] Kazuya Ishibashi, et al. Mismatched Decoding in the Brain, 2010, The Journal of Neuroscience.
[24] Jakob Heinzle, et al. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity, 2010, Journal of Computational Neuroscience.
[25] Viola Priesemann, et al. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy, 2011, BMC Neuroscience.
[26] N. Czakon, et al. X-ray, lensing and Sunyaev–Zel'dovich triaxial analysis of Abell 1835 out to R200, 2011, arXiv:1111.6189.
[27] A. Seth, et al. Causal density and integrated information as measures of conscious level, 2011, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences.
[28] A. Seth, et al. Behaviour of Granger causality under filtering: Theoretical invariance and practical application, 2011, Journal of Neuroscience Methods.
[29] Anil K. Seth, et al. Practical Measures of Integrated Information for Time-Series Data, 2011, PLoS Computational Biology.
[30] Eckehard Olbrich, et al. Shared Information – New Insights and Problems in Decomposing Information in Complex Systems, 2012, arXiv.
[31] Christof Koch, et al. Quantifying synergistic mutual information, 2012, arXiv.
[32] Karoline Wiesner, et al. Information-theoretic lower bound on energy cost of stochastic computation, 2011, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences.
[33] Paul F. M. J. Verschure, et al. Integrated information for large complex networks, 2013, The 2013 International Joint Conference on Neural Networks (IJCNN).
[34] Adam B. Barrett, et al. Granger causality is designed to measure effect, not mechanism, 2013, Frontiers in Neuroinformatics.
[35] Eckehard Olbrich, et al. Quantifying unique information, 2013, Entropy.
[36] Anil K. Seth, et al. The MVGC multivariate Granger causality toolbox: A new approach to Granger-causal inference, 2014, Journal of Neuroscience Methods.
[37] Larissa Albantakis, et al. From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0, 2014, PLoS Computational Biology.
[38] Virgil Griffith. A Principled Infotheoretic φ-like Measure, 2014.
[39] J. Holland. Complexity: A Very Short Introduction, 2014.
[40] Michael A. Cerullo, et al. The Problem with Phi: A Critique of Integrated Information Theory, 2015, PLoS Computational Biology.
[41] Adam B. Barrett, et al. An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems, 2014, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics.
[42] Nihat Ay, et al. Information Geometry on Complexity and Stochastic Interaction, 2015, Entropy.
[43] Max Tegmark, et al. Improved Measures of Integrated Information, 2016, PLoS Computational Biology.
[44] Shun-ichi Amari, et al. Unified framework for information integration based on information geometry, 2015, Proceedings of the National Academy of Sciences.
[45] Marian Verhelst, et al. Understanding Interdependency Through Complex Information Sharing, 2015, Entropy.
[46] Toru Yanagawa, et al. Measuring Integrated Information from the Decoding Perspective, 2015, PLoS Computational Biology.
[47] Friedrich Sommer. Moving Past the Minimum Information Partition: How To Quickly and Accurately Calculate Integrated Information, 2016, arXiv:1605.01096.
[48] Juan Carlos Farah, et al. Integrated Information and Metastability in Systems of Coupled Oscillators, 2016, arXiv:1606.08313.
[49] Joshua A. Grochow, et al. Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines, 2017, Entropy.
[50] Dirk Ostwald, et al. Computing integrated information, 2016, Neuroscience of Consciousness.
[51] Murray Shanahan, et al. Balanced Information Storage and Transfer in Modular Spiking Neural Networks, 2017, arXiv:1708.04392.
[52] Friedrich T. Sommer, et al. Great Than The Sum: Integrated Information In Large Brain Networks, 2017, arXiv:1708.02967.
[53] E. Tagliazucchi. The signatures of conscious access and its phenomenology are consistent with large-scale brain communication at criticality, 2017, Consciousness and Cognition.
[54] Robin A. A. Ince. Measuring multivariate redundant information with pointwise common change in surprisal, 2016, Entropy.
[55] Masafumi Oizumi, et al. Fast and exact search for the partition with minimal information loss, 2017, PLoS ONE.
[56] Jim Kay, et al. Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints, 2018, Entropy.
[57] Jure Leskovec, et al. Higher-order clustering in networks, 2017, Physical Review E.
[58] Daniel Toker, et al. Information integration in large brain networks, 2017, PLoS Computational Biology.