Contrasting information theoretic decompositions of modulatory and arithmetic interactions in neural information processing systems

Biological and artificial neural systems are composed of many local processors, and their capabilities depend upon the transfer function that relates each local processor's outputs to its inputs. This paper uses a recent advance in the foundations of information theory to study the properties of local processors that use contextual input to amplify or attenuate the transmission of information about their driving inputs. This advance enables the information transmitted by processors with two distinct inputs to be decomposed into the components unique to each input, the component shared between the two inputs, and the component that depends on both although it is present in neither alone, i.e., synergy. The decompositions that we report here show that contextual modulation has information-processing properties that contrast with those of all four simple arithmetic operators, that it can take various forms, and that the form used in our previous studies of artificial neural nets composed of local processors with both driving and contextual inputs is particularly well suited to providing the distinctive capabilities of contextual modulation under a wide range of conditions. We argue that the decompositions reported here could be compared with those obtained from empirical neurobiological and psychophysical data under conditions thought to reflect contextual modulation, which would shed new light on the underlying processes involved. Finally, we suggest that such decompositions could aid the design of context-sensitive machine learning algorithms.
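The decomposition in question is the partial information decomposition, under which the joint information splits as I(Y; R, C) = Unq(Y; R) + Unq(Y; C) + Shd(Y; R, C) + Syn(Y; R, C). The sketch below is not the authors' code; it merely illustrates the kind of contrast the abstract describes. It compares one candidate modulatory transfer function with two simple arithmetic ones on binary inputs and reports the classical mutual-information terms whose decomposition the paper studies. Assumptions to flag: the inputs r and c are taken to be uniform and independent on {-1, +1}, the output is binary with firing probability logistic(f(r, c)), and the modulatory form r(1 + exp(rc))/2 is assumed from the earlier Kay-Phillips work cited here.

```python
# Minimal sketch (not the authors' code): contrast a modulatory
# integration rule with arithmetic ones via mutual information.
import itertools
import math
from collections import defaultdict

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Candidate rules combining driving input r with contextual input c.
# The "modulatory" form is an assumption taken from earlier Kay-Phillips work.
RULES = {
    "additive": lambda r, c: r + c,
    "multiplicative": lambda r, c: r * c,
    "modulatory": lambda r, c: 0.5 * r * (1.0 + math.exp(r * c)),
}

def joint_distribution(rule):
    """Return p(r, c, y) as a dict keyed by (r, c, y) tuples."""
    p = {}
    for r, c in itertools.product((-1, 1), repeat=2):
        q = logistic(rule(r, c))
        p[(r, c, 1)] = 0.25 * q          # r, c uniform and independent
        p[(r, c, 0)] = 0.25 * (1.0 - q)
    return p

def marginal(p, idxs):
    """Marginalise p onto the coordinates listed in idxs."""
    m = defaultdict(float)
    for k, v in p.items():
        m[tuple(k[i] for i in idxs)] += v
    return m

def mutual_info(p, a_idx, b_idx):
    """I(A; B) in bits, with A and B given as coordinate index tuples."""
    pa, pb = marginal(p, a_idx), marginal(p, b_idx)
    na = len(a_idx)
    return sum(v * math.log2(v / (pa[k[:na]] * pb[k[na:]]))
               for k, v in marginal(p, a_idx + b_idx).items() if v > 0)

for name, rule in RULES.items():
    p = joint_distribution(rule)
    i_ry = mutual_info(p, (0,), (2,))     # I(R; Y)
    i_cy = mutual_info(p, (1,), (2,))     # I(C; Y)
    i_rcy = mutual_info(p, (0, 1), (2,))  # I(R, C; Y)
    # Co-information = redundancy minus synergy; it conflates the two,
    # so a full PID needs a redundancy measure such as Ibroja or Iccs.
    coi = i_ry + i_cy - i_rcy
    print(f"{name:>14}: I(R;Y)={i_ry:.3f}  I(C;Y)={i_cy:.3f}  "
          f"I(R,C;Y)={i_rcy:.3f}  co-info={coi:.3f}")
```

On this toy distribution the multiplicative rule transmits nothing about either input alone (its information is purely synergistic), the additive rule treats drive and context symmetrically, and the modulatory rule transmits mostly information about the drive while the context chiefly amplifies or attenuates that transmission, which is the asymmetry the abstract attributes to contextual modulation. Note that co-information only gives the balance of redundancy against synergy; separating the four components requires a specific redundancy measure, which is where the decompositions reported in the paper come in.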
