Self-supervised learning of grasp-dependent tool affordances on the iCub humanoid robot

The ability to learn about and efficiently use tools is a desirable property for general-purpose humanoid robots, as it allows them to extend their capabilities beyond the limitations of their own bodies. Yet it is a topic that the robotics community has only recently begun to tackle. Most studies published so far rely on tool representations that permit only very limited generalization of knowledge across similar tools. Moreover, most assume that the tool is always grasped in its common or canonical position, thereby ignoring the influence of the grasp configuration on the outcome of the actions performed with it. In this paper we present a method that tackles both issues simultaneously by combining an extended set of functional features with a novel representation of the effect of tool use. Together, these implicitly account for the grasp configuration and allow the iCub to generalize across tools based on their geometry. Moreover, learning happens in a self-supervised manner: first, the robot autonomously discovers the affordance categories of the tools by clustering the effects of their usage; these categories are then used as a teaching signal to associate visually obtained functional features with each tool's expected affordance. In our experiments we show how this technique can be used to select, given a tool, the best action to achieve a desired effect.
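To make the described pipeline concrete, below is a minimal sketch in Python. It is not the paper's implementation: k-means stands in for the effect clustering, an SVM over functional features (with a one-hot action encoding) stands in for the affordance predictor, and all data, shapes, and names (tool_feats, best_action, etc.) are hypothetical placeholders.

```python
"""Minimal sketch of the self-supervised affordance pipeline.

Assumptions (not from the paper): k-means for effect clustering,
an SVM for the feature-to-affordance mapping, and toy random data.
"""
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_actions = 200, 4

# One row per exploratory trial: functional features of the grasped tool
# (which implicitly reflect the grasp configuration) plus the action taken
# and the measured effect on the target object.
tool_feats = rng.normal(size=(n_trials, 16))     # placeholder visual features
actions = rng.integers(0, n_actions, n_trials)   # action index per trial
effects = rng.normal(size=(n_trials, 2))         # observed (dx, dy) of object

# Step 1 -- self-supervision: cluster the observed effects to discover
# affordance categories; the cluster index becomes the teaching label.
km = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = km.fit_predict(effects)

# Step 2 -- associate (functional features, action) with the expected
# affordance category discovered above.
X = np.hstack([tool_feats, np.eye(n_actions)[actions]])  # one-hot action
clf = SVC().fit(X, labels)

# Step 3 -- action selection: for a new tool, predict the affordance
# category of each candidate action and keep the action whose cluster
# centroid lies closest to the desired effect.
def best_action(feat, desired_effect):
    candidates = np.hstack([np.tile(feat, (n_actions, 1)), np.eye(n_actions)])
    predicted = clf.predict(candidates)          # category per candidate action
    dists = np.linalg.norm(km.cluster_centers_[predicted] - desired_effect,
                           axis=1)
    return int(np.argmin(dists))

print(best_action(rng.normal(size=16), np.array([0.1, 0.0])))
```

The key design point the sketch tries to capture is that the cluster indices, not any human-provided labels, supervise the classifier; action selection then reduces to querying the classifier once per candidate action and comparing the predicted categories' effect centroids against the desired effect.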
