Simulating the Emergence of Early Physical and Social Interactions: A Developmental Route through Low-Level Visuomotor Learning

In this paper, we propose a bio-inspired, developmental neural model that allows a robot, after learning its own dynamics during a babbling phase, to acquire imitative and shape-recognition abilities leading to early attempts at physical and social interaction. The motor controller is based on oscillators. During the babbling phase, the robot learns to associate its motor primitives (oscillators) with the visual optical flow induced by its own arm. It also learns to recognize its arm statically by selecting moving local views (feature points) in the visual field. In real indoor experiments, we demonstrate that, using the same model, early physical (reaching objects) and social (immediate imitation) interactions can emerge through visual ambiguities induced by external visual stimuli.
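The core mechanism described above, associating motor primitives with their self-induced optical flow during babbling, and later recalling a primitive when a visually similar flow is observed, can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the number of primitives, the direction-histogram flow descriptor, and the Hebbian-style running-average update are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K motor primitives (oscillators, each moving the
# arm in a characteristic direction) and D-binned optical-flow
# direction histograms as the visual descriptor.
K, D = 4, 8
W = np.zeros((K, D))  # association weights: primitive -> typical flow

def flow_histogram(angle, noise=0.1):
    """Toy optical-flow descriptor: direction histogram peaked at `angle`."""
    bins = np.linspace(0, 2 * np.pi, D, endpoint=False)
    h = np.exp(np.cos(bins - angle) / 0.3)  # sharp circular bump
    h += noise * rng.random(D)              # visual noise
    return h / h.sum()

# Babbling phase: each executed primitive k induces a flow pattern,
# which is associated to it with a Hebbian-style running average.
primitive_angle = {k: 2 * np.pi * k / K for k in range(K)}
for _ in range(50):
    k = int(rng.integers(K))                # random motor exploration
    h = flow_histogram(primitive_angle[k])  # self-induced optical flow
    W[k] += 0.1 * (h - W[k])                # update learned association

# Immediate imitation via visual ambiguity: an external movement
# produces a flow that best matches one of the robot's own primitives,
# so that primitive is recalled and executed.
observed = flow_histogram(primitive_angle[2])
recalled = int(np.argmax(W @ observed))
```

Because the observed external flow is ambiguous with the flow the robot would produce itself, the matching primitive wins the recall, which is the mechanism by which both imitation and reaching emerge from the same learned sensorimotor associations.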
