Fast and exact search for the partition with minimal information loss

In the analysis of multi-component complex systems, such as neural systems, identifying groups of units that share similar functionality aids understanding of the underlying structure of the system. To find such a grouping, it is useful to evaluate to what extent the units of the system are separable. Separability can be evaluated by quantifying how much information would be lost if the system were partitioned into subsystems and the interactions between the subsystems were hypothetically removed. A system of two independent subsystems is completely separable without any loss of information, whereas a system of strongly interacting subsystems cannot be separated without a large loss of information. Among all possible partitions of a system, the partition that minimizes the loss of information, called the Minimum Information Partition (MIP), can be considered the optimal partition for characterizing the underlying structure of the system. Although the MIP would reveal novel characteristics of neural systems, an exhaustive search for the MIP is numerically intractable due to the combinatorial explosion of possible partitions. Here, we propose a computationally efficient algorithm that exactly identifies the MIP among all possible partitions when the measure of information loss is submodular. Submodularity is a mathematical property of set functions that is analogous to convexity in continuous functions. Mutual information is one such submodular information-loss function, and is a natural choice for measuring the degree of statistical dependence between paired sets of random variables. Using mutual information as the loss function, we show that the search for the MIP can be performed in a practical amount of computational time for reasonably large systems (N = 100 ∼ 1000).
We also demonstrate that the MIP search allows for the detection of underlying global structures in a network of nonlinear oscillators.
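The key idea can be sketched concretely. For a multivariate Gaussian, the mutual information between a subset S and its complement has a closed form in log-determinants of covariance submatrices, and it is a symmetric submodular set function, so it can be minimized exactly in polynomial time with Queyranne's algorithm for symmetric submodular function minimization. The sketch below is illustrative, not the paper's actual implementation; the function names (`gaussian_mi_loss`, `queyranne`) and the toy covariance are assumptions introduced here for demonstration.

```python
import numpy as np

def gaussian_mi_loss(cov):
    """f(S) = I(X_S ; X_{V\\S}) for a zero-mean Gaussian with covariance `cov`.

    For Gaussians this has the closed form
    0.5 * (log det Sigma_S + log det Sigma_T - log det Sigma),
    which is a symmetric submodular set function of S.
    """
    n = cov.shape[0]
    _, logdet_all = np.linalg.slogdet(cov)

    def f(S):
        S = sorted(S)
        T = sorted(set(range(n)) - set(S))
        if not S or not T:          # trivial partitions lose no information
            return 0.0
        _, ld_S = np.linalg.slogdet(cov[np.ix_(S, S)])
        _, ld_T = np.linalg.slogdet(cov[np.ix_(T, T)])
        return 0.5 * (ld_S + ld_T - logdet_all)

    return f

def queyranne(f, V):
    """Return (S, f(S)) minimizing a symmetric submodular f
    over nonempty proper subsets of V (Queyranne, 1998)."""
    groups = [[v] for v in V]       # current "super-elements" (merged originals)
    best_set, best_val = None, float("inf")
    while len(groups) > 1:
        # Build an ordering by repeatedly appending the group u that
        # minimizes f(W + u) - f(u), where W is the prefix built so far.
        order, rest = [groups[0]], groups[1:]
        while rest:
            prefix = [x for g in order for x in g]
            u = min(rest, key=lambda g: f(prefix + g) - f(g))
            order.append(u)
            rest.remove(u)
        # The last group in the ordering is one side of a "pendent pair"
        # and hence a candidate minimizer.
        val = f(order[-1])
        if val < best_val:
            best_set, best_val = set(order[-1]), val
        # Merge the pendent pair and recurse on the smaller ground set.
        groups = order[:-2] + [order[-2] + order[-1]]
    return best_set, best_val

# Toy system: two correlated pairs, with {0,1} independent of {2,3}.
cov = np.array([[1.0, 0.8, 0.0, 0.0],
                [0.8, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.8],
                [0.0, 0.0, 0.8, 1.0]])
mip, loss = queyranne(gaussian_mi_loss(cov), list(range(4)))
print(mip, loss)   # the MIP cuts between the two independent blocks, with ~0 loss
```

Each outer iteration shrinks the ground set by one, and each ordering costs O(N^2) evaluations of f, giving O(N^3) evaluations overall, which is what makes exact MIP search feasible at N in the hundreds rather than requiring enumeration of all 2^(N-1) bipartitions.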
