Learning sensorimotor navigation using synchrony-based partner selection

Future robots are expected to become our partners and to share the environments of our daily lives. Since they will have to coexist with "non-expert" people (the elderly, impaired people, children, etc.), we must rethink the way we design human/robot interactions. In this paper, we take a radically simplified route that takes advantage of recent discoveries in low-level human interactions and dynamical motor control. Indeed, we argue that the dynamics of the interaction must be taken into account. We therefore propose a bio-inspired neuronal architecture that mimics adult/infant interactions, which: (1) are initiated through synchrony-based partner selection, (2) are maintained and re-engaged through partner recognition and focus of attention, and (3) allow sensorimotor navigation to be learned from place/action associations. Our experiment shows good results for learning a navigation area and shows that this approach is promising for more complex tasks and interactions.
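To make step (1) concrete, synchrony-based partner selection can be illustrated as picking the candidate whose motion signal is most correlated, within a small lag window, with the robot's own rhythmic signal. The sketch below is a minimal, hypothetical illustration under that assumption; the function names, the normalized cross-correlation measure, and the lag window are our own choices, not the neural implementation used in the paper.

```python
import numpy as np

def synchrony_score(robot_signal, partner_signal, max_lag=10):
    """Peak normalized cross-correlation within +/- max_lag samples."""
    r = (robot_signal - robot_signal.mean()) / (robot_signal.std() + 1e-9)
    p = (partner_signal - partner_signal.mean()) / (partner_signal.std() + 1e-9)
    n = len(r)
    scores = []
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = r[lag:], p[:n - lag]
        else:
            a, b = r[:n + lag], p[-lag:]
        scores.append(np.dot(a, b) / len(a))
    return max(scores)

def select_partner(robot_signal, candidate_signals):
    """Return the index of the candidate most synchronous with the robot."""
    scores = [synchrony_score(robot_signal, s) for s in candidate_signals]
    return int(np.argmax(scores))

# Toy example: one candidate oscillates nearly in phase with the robot,
# the other moves randomly; the in-phase candidate is selected.
t = np.linspace(0, 4 * np.pi, 200)
robot = np.sin(t)
in_sync = np.sin(t + 0.1)                              # near phase-locked
out_sync = np.random.default_rng(0).normal(size=200)   # unrelated motion
print(select_partner(robot, [in_sync, out_sync]))      # -> 0
```

In the paper's setting, such a score would act as the reinforcement-like signal that initiates the interaction; the actual architecture realizes this with neural dynamics rather than explicit correlation sums.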
