Detecting the functional similarities between tools using a hierarchical representation of outcomes

The ability to reason about multiple tools and their functional similarities is a prerequisite for intelligent tool use. This paper presents a model that allows a robot to detect the similarity between tools based on the environmental outcomes observed with each tool. To do this, the robot incrementally learns an adaptive hierarchical representation (i.e., a taxonomy) of the types of environmental changes that it can induce and detect with each tool. Using the learned taxonomies, the robot can infer the similarity between different tools based on the types of outcomes they produce. The results show that the robot learns accurate outcome models for six different tools and, using these models, successfully detects the functional similarity between tools.
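The core idea can be illustrated with a minimal sketch (not the paper's implementation): outcome feature vectors from all tools are grouped into outcome types by a simple agglomerative clustering, and two tools are then compared by the overlap (Jaccard similarity) of the outcome types each produces. The tool names, feature values, and distance threshold below are hypothetical placeholders.

```python
# Illustrative sketch, not the paper's actual method: cluster outcome
# feature vectors into outcome types, then compare tools by the sets
# of outcome types they produce. All data below is hypothetical.
from itertools import combinations


def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


def cluster_outcomes(outcomes, threshold):
    """Greedy single-linkage clustering: merge two clusters whenever
    their closest pair of points is within the distance threshold."""
    clusters = [[o] for o in outcomes]
    merged = True
    while merged:
        merged = False
        for i, j in combinations(range(len(clusters)), 2):
            d = min(euclid(a, b) for a in clusters[i] for b in clusters[j])
            if d <= threshold:
                clusters[i] += clusters[j]
                del clusters[j]
                merged = True
                break  # restart scan after mutating the cluster list
    return clusters


def outcome_types(tool_outcomes, clusters):
    """Map a tool's outcomes to the indices of the clusters they fall in."""
    return {idx for o in tool_outcomes
            for idx, c in enumerate(clusters) if o in c}


def tool_similarity(types_a, types_b):
    """Jaccard similarity over the two tools' sets of outcome types."""
    if not types_a and not types_b:
        return 1.0
    return len(types_a & types_b) / len(types_a | types_b)


# Hypothetical outcome features (e.g., object displacement dx, dy):
hook = [(1.0, 0.0), (1.1, 0.1)]   # pulls the object toward the robot
rake = [(0.9, 0.05)]              # produces similar pulling outcomes
stick = [(-1.0, 0.0)]             # pushes the object away

clusters = cluster_outcomes(hook + rake + stick, threshold=0.5)
sim_hook_rake = tool_similarity(outcome_types(hook, clusters),
                                outcome_types(rake, clusters))
sim_hook_stick = tool_similarity(outcome_types(hook, clusters),
                                 outcome_types(stick, clusters))
```

With these toy numbers, the three pulling outcomes merge into one outcome type while the pushing outcome stays separate, so the hook and rake come out maximally similar while the hook and stick share no outcome types. The actual model in the paper grows the taxonomy incrementally rather than clustering in one batch.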
