Object category recognition by a humanoid robot using behavior-grounded relational learning

The ability to form and recognize object categories is fundamental to human intelligence. This paper proposes a behavior-grounded relational classification model that allows a robot to recognize the categories of household objects. In the proposed approach, the robot initially explores the objects by applying five exploratory behaviors (lift, shake, drop, crush, and push) to them while recording the proprioceptive and auditory feedback produced by each interaction. The sensorimotor data are used to estimate multiple measures of similarity between the objects, each corresponding to a specific coupling between an exploratory behavior and a sensory modality. A graph-based recognition model is trained by extracting features from the estimated similarity relations, allowing the robot to recognize the category memberships of a novel object based on its similarity to the set of familiar objects. The framework was evaluated on an upper-torso humanoid robot with two large sets of household objects. The results show that the robot's model recognizes complex object categories (e.g., metal objects, empty bottles) significantly better than chance.
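For illustration, the sketch below shows one minimal way such a relational recognizer could operate: each behavior-modality context contributes a vector of similarities between the novel object and the familiar objects, and each category is scored by the novel object's aggregate similarity to that category's familiar members. The function names, the simple averaging rule, and the toy data are assumptions for illustration only; the paper's actual model extracts features from the similarity relations and trains a graph-based classifier.

```python
# Minimal sketch (not the paper's implementation) of category recognition from
# multiple behavior-modality similarity relations. All names and the toy data
# below are hypothetical.
import numpy as np


def category_scores(sim_to_known, labels, categories):
    """For one behavior-modality context, score each category as the mean
    similarity of the novel object to the familiar objects in that category."""
    return {c: float(np.mean([s for s, l in zip(sim_to_known, labels) if l == c]))
            for c in categories}


def recognize(sims_by_context, labels, categories):
    """Combine per-context category scores (here, by simple averaging) and
    return the categories ranked by overall similarity to the novel object."""
    totals = {c: 0.0 for c in categories}
    for sim_to_known in sims_by_context:
        scores = category_scores(sim_to_known, labels, categories)
        for c in categories:
            totals[c] += scores[c] / len(sims_by_context)
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    # Toy example: 4 familiar objects and 2 behavior-modality contexts
    # (e.g., shake-audio and lift-proprioception); similarities lie in [0, 1].
    labels = ["metal", "metal", "plastic", "plastic"]
    sims_by_context = [
        np.array([0.9, 0.8, 0.2, 0.3]),  # novel vs. familiar objects, context 1
        np.array([0.7, 0.6, 0.4, 0.3]),  # novel vs. familiar objects, context 2
    ]
    print(recognize(sims_by_context, labels, {"metal", "plastic"}))
```

In this toy run the novel object is scored most similar to the "metal" objects across both contexts; the paper's contribution lies in learning such category decisions from many behavior-modality couplings rather than hand-set averaging.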
