MAXENT3D_PID: An Estimator for the Maximum-Entropy Trivariate Partial Information Decomposition
Abdullah Makkeh | Daniel Chicharro | Dirk Oliver Theis | Raul Vicente
[1] G. Tononi,et al. Rethinking segregation and integration: contributions of whole-brain modelling , 2015, Nature Reviews Neuroscience.
[2] Viola Priesemann,et al. Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition , 2017, Entropy.
[3] Guido Montúfar,et al. The Variational Deficiency Bottleneck , 2020, 2020 International Joint Conference on Neural Networks (IJCNN).
[4] David J. Field,et al. Sparse coding with an overcomplete basis set: A strategy employed by V1? , 1997, Vision Research.
[5] Ralph Linsker,et al. Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network , 1992, Neural Computation.
[6] Rob Brekelmans,et al. Disentangled representations via synergy minimization , 2017, 2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[7] Raul Vicente,et al. Transfer Entropy in Neuroscience , 2014 .
[8] Dirk Oliver Theis,et al. Bivariate Partial Information Decomposition: The Optimization Perspective , 2017, Entropy.
[9] James P. Crutchfield,et al. dit: a Python package for discrete information theory , 2018, J. Open Source Softw..
[10] Christoph Salge,et al. A Bivariate Measure of Redundant Information , 2012, Physical review. E, Statistical, nonlinear, and soft matter physics.
[11] A. Ledberg,et al. Framework to study dynamic dependencies in networks of interacting processes. , 2012, Physical review. E, Statistical, nonlinear, and soft matter physics.
[12] Adam B. Barrett,et al. An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems , 2014, Physical review. E, Statistical, nonlinear, and soft matter physics.
[13] Johannes Rauh,et al. Secret Sharing and Shared Information , 2017, Entropy.
[14] David R. Anderson,et al. Model selection and multimodel inference : a practical information-theoretic approach , 2003 .
[15] Christof Koch,et al. Quantifying synergistic mutual information , 2012, ArXiv.
[16] Jie Sun,et al. Identifying the Coupling Structure in Complex Systems through the Optimal Causation Entropy Principle , 2014, Entropy.
[17] Seth Frey,et al. Synergistic Information Processing Encrypts Strategic Reasoning in Poker , 2018, Cogn. Sci..
[18] Tian Zheng,et al. Inference of Regulatory Gene Interactions from Expression Data Using Three‐Way Mutual Information , 2009, Annals of the New York Academy of Sciences.
[19] Nihat Ay,et al. Robustness, canalyzing functions and systems design , 2012, Theory in Biosciences.
[20] Dirk Oliver Theis,et al. Analyzing Information Distribution in Complex Systems , 2017, Entropy.
[21] Abdullah Makkeh. Applications of optimization in some complex systems , 2018 .
[22] Luca Faes,et al. Synergetic and redundant information flow detected by unnormalized Granger causality: application to resting state fMRI , 2015, IEEE Transactions on Biomedical Engineering.
[23] Eckehard Olbrich,et al. On extractable shared information , 2017, Entropy.
[24] D. Anastassiou. Computational analysis of the synergy among multiple interacting genes , 2007, Molecular systems biology.
[25] Daniel Chicharro,et al. The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy , 2017, Entropy.
[26] Nihat Ay,et al. Hierarchical Quantification of Synergy in Channels , 2016, Front. Robot. AI.
[27] Ralf Der,et al. Guided self-organization: perception–action loops of embodied systems , 2011, Theory in Biosciences.
[28] Joseph T. Lizier,et al. Towards a synergy-based approach to measuring information modification , 2013, 2013 IEEE Symposium on Artificial Life (ALife).
[29] Terrence J. Sejnowski,et al. An Information-Maximization Approach to Blind Separation and Blind Deconvolution , 1995, Neural Computation.
[30] Luca Faes,et al. Estimating the decomposition of predictive information in multivariate systems. , 2015, Physical review. E, Statistical, nonlinear, and soft matter physics.
[31] James P. Crutchfield,et al. Unique information via dependency constraints , 2017, Journal of Physics A: Mathematical and Theoretical.
[32] Stephen J. Wright,et al. Primal-Dual Interior-Point Methods , 1997 .
[33] Joseph T. Lizier,et al. Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices , 2018, Entropy.
[34] I. Couzin,et al. Inferring the structure and dynamics of interactions in schooling fish , 2011, Proceedings of the National Academy of Sciences.
[35] Joseph T. Lizier,et al. Directed Information Measures in Neuroscience , 2014 .
[36] Schreiber,et al. Measuring information transfer , 2000, Physical review letters.
[37] Gordon Pipa,et al. Transfer entropy—a model-free measure of effective connectivity for the neurosciences , 2010, Journal of Computational Neuroscience.
[38] James P. Crutchfield,et al. A Perspective on Unique Information: Directionality, Intuitions, and Secret Key Agreement , 2018, ArXiv.
[39] Luca Faes,et al. An Information-Theoretic Framework to Map the Spatiotemporal Dynamics of the Scalp Electroencephalogram , 2016, IEEE Transactions on Biomedical Engineering.
[40] Guy Theraulaz,et al. Informative and misinformative interactions in a school of fish , 2017, Swarm Intelligence.
[41] Sanjay Mehrotra,et al. On the Implementation of a Primal-Dual Interior Point Method , 1992, SIAM J. Optim..
[42] Michael J. Berry,et al. Network information and connected correlations. , 2003, Physical review letters.
[43] Daniel Chicharro,et al. Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss , 2016, Entropy.
[44] Robin A. A. Ince. Measuring multivariate redundant information with pointwise common change in surprisal , 2016, Entropy.
[45] Karl J. Friston,et al. Effective connectivity: Influence, causality and biophysical modeling , 2011, NeuroImage.
[46] Sami El Boustani,et al. Prediction of spatiotemporal patterns of neural activity from pairwise correlations. , 2009, Physical review letters.
[47] Eckehard Olbrich,et al. Information Decomposition and Synergy , 2015, Entropy.
[48] Murray Shanahan,et al. The Partial Information Decomposition of Generative Neural Network Models , 2017, Entropy.
[49] Daniel Chicharro,et al. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables , 2017, Entropy.
[50] Michael J. Berry,et al. Predictive information in a sensory population , 2013, Proceedings of the National Academy of Sciences.
[51] Paul L. Williams,et al. Information dynamics: Its theory and application to embodied cognitive systems , 2011 .
[52] Jim Kay,et al. Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints , 2018, Entropy.
[53] C. E. Shannon. A Mathematical Theory of Communication , 1948, Bell System Technical Journal.
[54] Naftali Tishby,et al. Opening the Black Box of Deep Neural Networks via Information , 2017, ArXiv.
[55] Prantik Chatterjee,et al. Construction of synergy networks from gene expression data related to disease. , 2016, Gene.
[56] Eckehard Olbrich,et al. Quantifying unique information , 2013, Entropy.
[57] P. Latham,et al. Synergy, Redundancy, and Independence in Population Codes, Revisited , 2005, The Journal of Neuroscience.
[58] Daniel Chicharro,et al. Quantifying multivariate redundancy with maximum entropy decompositions of mutual information , 2017, ArXiv.
[59] Jim Kay,et al. Partial information decomposition as a unified approach to the specification of neural goal functions , 2015, Brain and Cognition.
[60] Raul Vicente,et al. Efficient Estimation of Information Transfer , 2014 .
[61] E. Davidson,et al. The evolution of hierarchical gene regulatory networks , 2009, Nature Reviews Genetics.
[62] Bryan C. Daniels,et al. Quantifying collectivity , 2016, Current Opinion in Neurobiology.
[63] Benjamin Flecker,et al. Synergy, redundancy, and multivariate information measures: an experimentalist’s perspective , 2014, Journal of Computational Neuroscience.
[64] Guido Montúfar,et al. Computing the Unique Information , 2017, 2018 IEEE International Symposium on Information Theory (ISIT).
[65] Stephen P. Boyd,et al. ECOS: An SOCP solver for embedded systems , 2013, 2013 European Control Conference (ECC).
[66] Dirk Oliver Theis,et al. BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition , 2018, Entropy.
[67] Eckehard Olbrich,et al. Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems , 2012, ArXiv.
[68] Keyan Zahedi,et al. Morphological Computation: Synergy of Body and Brain , 2017, Entropy.
[69] Shun-ichi Amari,et al. Information geometry on hierarchy of probability distributions , 2001, IEEE Trans. Inf. Theory.
[70] K. Hlavácková-Schindler,et al. Causality detection based on information-theoretic approaches in time series analysis , 2007 .
[71] Stefano Panzeri,et al. Quantifying how much sensory information in a neural code is relevant for behavior , 2017, NIPS.
[72] Joseph T. Lizier,et al. JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems , 2014, Front. Robot. AI.
[73] Thomas M. Cover,et al. Elements of Information Theory , 2005 .
[74] Ralf Der,et al. Information-driven self-organization: the dynamical system approach to autonomous robot behavior , 2011, Theory in Biosciences.
[75] Viola Priesemann,et al. Bits from Brains for Biologically Inspired Computing , 2014, Front. Robot. AI.
[76] Naftali Tishby,et al. The information bottleneck method , 2000, ArXiv.
[77] James P. Crutchfield,et al. Multivariate Dependence Beyond Shannon Information , 2016, Entropy.
[78] Jessica C. Flack,et al. Multiple time-scales and the developmental dynamics of social systems , 2012, Philosophical Transactions of the Royal Society B: Biological Sciences.
[79] Luca Faes,et al. Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes , 2017, Entropy.
[80] Virgil Griffith,et al. Synergy, Redundancy and Common Information , 2015, ArXiv.
[81] Randall D. Beer,et al. Nonnegative Decomposition of Multivariate Information , 2010, ArXiv.
[82] Michael J. Berry,et al. Synergy, Redundancy, and Independence in Population Codes , 2003, The Journal of Neuroscience.