Effective transfer learning of affordances for household robots

Learning how to use functional objects is essential for robots that carry out household tasks. However, learning to use every object from scratch is naive and time-consuming. In this paper, we propose transfer learning of affordances to reduce the number of exploratory actions needed to learn how to use a new object. Through embodied interaction with the object, the robot discovers the object's similarity to previously learned objects by comparing their shape features and the spatial relations between object parts. The robot actively selects object parts along with parameterized actions and evaluates the effects on-line. We demonstrate through real-world experiments with the humanoid robot NAO that our method speeds up learning to use a new type of garbage can by transferring the affordances learned previously for similar garbage cans.
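The core transfer step described above, finding the previously learned object most similar to a new one by comparing shape features, can be sketched as a nearest-neighbor lookup in feature space. The object names, feature vectors, and the choice of Euclidean distance below are illustrative assumptions, not the paper's actual representation:

```python
import math

# Hypothetical shape-feature vectors (e.g., part dimensions, convexity
# scores) for previously learned objects; values are illustrative only.
known_objects = {
    "pedal_bin": [0.30, 0.45, 0.80],
    "swing_bin": [0.28, 0.50, 0.35],
    "mug":       [0.10, 0.12, 0.90],
}

def most_similar(new_features, library):
    """Return the name of the known object whose shape features are
    closest (Euclidean distance) to the new object's features."""
    return min(
        library,
        key=lambda name: math.dist(library[name], new_features),
    )

# A new garbage can resembling the pedal bin: its affordance model
# would serve as the starting point, so exploration refines an
# existing model rather than learning from scratch.
new_can = [0.31, 0.44, 0.78]
source = most_similar(new_can, known_objects)
print(source)
```

In a full system, the matched object's affordance model would initialize the exploration of the new object, and the on-line evaluation of action effects would confirm or revise the transferred model.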
