Inter-robot transfer learning for perceptual classification

We introduce the novel problem of inter-robot transfer learning for perceptual classification of objects, in which multiple heterogeneous robots communicate and transfer learned object models consisting of a fusion of multiple object properties. Unlike traditional transfer learning, there can be severe differences between the robots' data distributions, resulting from differences in the sensing, sensory processing, or even representations that each robot uses to learn. Furthermore, only some properties may overlap between the two robots. We show that in such cases, the abstraction of raw sensory data into an intermediate representation can be used not only to aid learning, but also to aid the transfer of knowledge. In addition, we utilize statistical metrics, learned during an interactive process in which the robots jointly explore the environment, to determine which underlying properties are shared between the robots. We demonstrate results in a visual classification task where objects are represented via a combination of properties derived from different modalities: color, texture, shape, and size. Using our methods, two heterogeneous robots utilizing different sensors and representations are able to successfully transfer support vector machine (SVM) classifiers to each other, resulting in faster learning.
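As a rough illustration of the property-matching step described above (a sketch, not the paper's actual algorithm), the code below compares histograms of property readings that two robots record while jointly observing the same objects. Property pairs whose symmetric KL divergence falls below a threshold are treated as shared and thus as candidates for classifier transfer. All names, bin counts, and the threshold value are hypothetical.

```python
import math

def kl(p, q, eps=1e-9):
    """KL divergence between two discrete distributions (same bin count)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def sym_kl(p, q):
    """Symmetrized KL divergence, so the metric is order-independent."""
    return 0.5 * (kl(p, q) + kl(q, p))

def normalize(counts):
    total = float(sum(counts))
    return [c / total for c in counts]

def match_properties(hists_a, hists_b, threshold=0.5):
    """Pair each of robot A's properties with robot B's most similar
    property, keeping only pairs whose divergence is below threshold."""
    matches = {}
    for name_a, h_a in hists_a.items():
        p = normalize(h_a)
        name_b, h_b = min(hists_b.items(),
                          key=lambda kv: sym_kl(p, normalize(kv[1])))
        d = sym_kl(p, normalize(h_b))
        if d < threshold:
            matches[name_a] = (name_b, d)
    return matches

# Toy joint-exploration data: each robot binned its readings for the
# same set of objects into 4 bins per property.
robot_a = {"hue": [8, 1, 1, 0], "size": [1, 2, 3, 4]}
robot_b = {"color": [7, 2, 1, 0], "texture": [2, 8, 1, 1]}
print(match_properties(robot_a, robot_b))
```

Here "hue" maps onto "color" because the robots' histograms over the shared objects nearly agree, while "size" finds no sufficiently similar counterpart and is excluded from transfer; only the classifiers built on matched properties would then be exchanged.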
