Skull-Closed Autonomous Development

It is largely unknown how the brain develops its representations inside its closed skull throughout the lifetime, while the child incrementally learns one new task after another. By closed skull, we mean that the brain (or Central Nervous System) inside the skull is off limits to the teachers in the external environment, except at its sensory ends and motor ends. We present Where-What Network (WWN) 6, which realizes our goal of a fully developmental network with a closed skull: the human programmer is not allowed to handcraft the internal representation for any extra-body concepts. We describe how the developmental program (DP) of WWN-6 enables the network to learn and perform the task of attending to and recognizing objects in complex backgrounds while the skull stays closed.
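To make the skull-closed constraint concrete, the sketch below is a minimal illustration, not the actual WWN-6 algorithm: it assumes a single internal layer trained by top-1 winner-take-all Hebbian updates, loosely in the spirit of a developmental program. The environment (teacher) may only write the sensory vector x and supervise the motor vector z; the internal weights are never handcrafted or inspected. The class name SkullClosedNet, the layer sizes, and the age-based learning-rate schedule (a crude stand-in for an amnesic average) are all illustrative assumptions.

```python
import numpy as np

class SkullClosedNet:
    """Minimal sketch of a skull-closed learner: teaching happens only at
    the sensory end (x) and the motor end (z); the internal layer y and
    its weights are shaped solely by experience."""

    def __init__(self, x_dim, y_dim, z_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Bottom-up (sensory -> internal) and top-down (motor -> internal)
        # weights start random; the programmer never sets them by hand.
        self.Wx = rng.random((y_dim, x_dim))
        self.Wz = rng.random((y_dim, z_dim))
        self.Wm = rng.random((z_dim, y_dim))   # internal -> motor
        self.age = np.ones(y_dim)              # per-neuron firing ages

    def update(self, x, z_teach=None, lr_floor=0.1):
        # Internal pre-response combines bottom-up input with top-down
        # motor context when a teacher acts at the motor end.
        pre = self.Wx @ x + (self.Wz @ z_teach if z_teach is not None else 0.0)
        winner = int(np.argmax(pre))           # top-1 winner-take-all
        y = np.zeros_like(self.age)
        y[winner] = 1.0
        # Hebbian-like update of the winner only; the learning rate
        # decays with the neuron's firing age, down to a floor.
        lr = max(1.0 / self.age[winner], lr_floor)
        self.Wx[winner] += lr * (x - self.Wx[winner])
        if z_teach is not None:
            self.Wz[winner] += lr * (z_teach - self.Wz[winner])
            self.Wm[:, winner] += lr * (z_teach - self.Wm[:, winner])
        self.age[winner] += 1
        return self.Wm @ y                     # motor output (prediction)

# Illustrative usage: supervised steps impose z at the motor end;
# free-running steps only read the motor output.
net = SkullClosedNet(x_dim=16, y_dim=8, z_dim=4)
x = np.random.rand(16)          # sensory end, e.g. a flattened image patch
z = np.eye(4)[2]                # motor end: teacher imposes class 2
net.update(x, z_teach=z)        # teaching confined to the two ends
pred = net.update(x)            # later: read the emergent motor response
```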
