Information maintenance and statistical dependence reduction in simple neural networks

This study compares the ability of excitatory, feed-forward neural networks to construct good transformations of their inputs. The quality of a transformation is judged by the minimization of two information measures: the information loss of the transformation and the statistical dependence of the output. The networks compared differ in the parametric properties of their neurons and in their connectivity. The particular network parameters studied are output firing threshold, synaptic connectivity, and associative modification of connection weights. The parameters that most directly affect firing levels are threshold and connectivity. Networks incorporating neurons with dynamic threshold adjustment produce better transformations. When the firing threshold is optimized, sparser synaptic connectivity produces a better transformation than denser connectivity. Associative modification of synaptic weights confers only a slight advantage in constructing optimal transformations. Additionally, the results show that some environments are better suited than others for recoding: input environments high in statistical dependence, i.e., those most in need of recoding, are more likely to undergo successful transformations.
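The two measures can be made concrete with a small worked example. The sketch below is a minimal illustration under assumed parameters, not the simulation used in the study: it enumerates all input patterns of a tiny binary threshold network and computes (i) the information loss of a deterministic transformation, H(X|Y) = H(X) - H(Y), and (ii) the statistical dependence of the output as the total correlation, sum_i H(Y_i) - H(Y). The layer sizes, connectivity density, threshold value, and uniform input distribution are illustrative assumptions only.

```python
# Minimal sketch (assumed parameters, not the authors' code) of the two
# information measures for a small excitatory feed-forward threshold network.
import itertools
import math
from collections import Counter

import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 6, 6          # assumed layer sizes
P_CONNECT = 0.5             # assumed synaptic connectivity (probability)
THETA = 2                   # assumed firing threshold (active inputs needed to fire)

# Random excitatory (0/1) feed-forward connection matrix.
W = (rng.random((N_OUT, N_IN)) < P_CONNECT).astype(int)

def transform(x):
    """Deterministic threshold units: fire iff summed excitation >= THETA."""
    return tuple((W @ np.asarray(x) >= THETA).astype(int))

def entropy(prob):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in prob.values() if p > 0)

# Uniform distribution over all binary input patterns (an assumption; the
# study also considers structured, statistically dependent environments).
inputs = list(itertools.product((0, 1), repeat=N_IN))
p_x = {x: 1.0 / len(inputs) for x in inputs}

# Joint output distribution induced by the transformation.
p_y = Counter()
for x, px in p_x.items():
    p_y[transform(x)] += px

H_X = entropy(p_x)
H_Y = entropy(p_y)

# Information loss of a deterministic transformation: H(X|Y) = H(X) - H(Y).
info_loss = H_X - H_Y

# Statistical dependence of the output (total correlation):
# sum of marginal unit entropies minus the joint output entropy.
marginal_H = 0.0
for j in range(N_OUT):
    pj = Counter()
    for y, py in p_y.items():
        pj[y[j]] += py
    marginal_H += entropy(pj)
dependence = marginal_H - H_Y

print(f"information loss  H(X|Y) = {info_loss:.3f} bits")
print(f"output dependence  sum_i H(Y_i) - H(Y) = {dependence:.3f} bits")
```

Both quantities are zero for an ideal recoding (a lossless, factorial code); raising THETA or thinning W in this sketch shows how firing level and connectivity trade off against the two measures.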
