Towards learning strategies and exploration patterns for feature perception

Infants spend much of their first year visually exploring the scene around them, and over that period the level of detail they can perceive increases significantly. This study investigates the ability to perceive areas of interest with respect to human developmental changes in vision, specifically acuity and field of view, over the first year of life. Two scenarios, learning through a series of developmental constraints and learning without any constraints, shed light on how a humanoid robot scaffolds the learning of interesting areas in the scene through different emergent exploratory behaviours. Divergence and convergence in the acquired features is reported, demonstrating their potential for use at a higher level of understanding. Staged strategies with early sensory constraints, combined with exploratory behaviour based on “similarity searches”, improve the quality of the acquired features and may serve as a mechanism for better on-line learning of object knowledge.
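To make the idea of staged sensory constraints concrete, the sketch below shows one way such constraints could be simulated on a robot camera frame: coarse acuity is approximated by a low-pass blur and a narrow field of view by masking the periphery, with both relaxed over "developmental" stages. This is a minimal illustration, not the authors' implementation; the stage schedule, parameter values, and function names are assumptions.

```python
"""Minimal sketch (assumed, not the paper's code) of staged visual constraints:
acuity approximated by Gaussian blur, field of view by a central mask,
both relaxed across developmental stages."""

import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical schedule: (blur sigma in pixels, fraction of field of view kept).
# Earlier stages see a blurrier image over a smaller central region.
STAGES = [
    (8.0, 0.3),   # "newborn": very low acuity, narrow field of view
    (4.0, 0.5),
    (2.0, 0.7),
    (0.5, 1.0),   # near-adult acuity, full field of view
]

def constrain_frame(frame: np.ndarray, stage: int) -> np.ndarray:
    """Blur the frame and mask everything outside a central window."""
    sigma, fov = STAGES[stage]
    # Reduce acuity: low-pass filter spatial detail (colour channels untouched).
    blurred = gaussian_filter(frame.astype(float), sigma=(sigma, sigma, 0))

    # Restrict field of view: keep only a central region, zero the periphery.
    h, w = frame.shape[:2]
    fh, fw = int(h * fov), int(w * fov)
    top, left = (h - fh) // 2, (w - fw) // 2
    masked = np.zeros_like(blurred)
    masked[top:top + fh, left:left + fw] = blurred[top:top + fh, left:left + fw]
    return masked.astype(frame.dtype)

if __name__ == "__main__":
    # Stand-in for a camera frame from the robot.
    frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
    for s in range(len(STAGES)):
        out = constrain_frame(frame, s)
        print(f"stage {s}: non-zero pixels = {np.count_nonzero(out)}")
```

In a staged learning regime, exploration and feature acquisition would run against `constrain_frame(frame, stage)` with `stage` advanced over time, whereas the unconstrained scenario corresponds to always using the final stage.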
