Mixed-Transfer: Transfer Learning over Mixed Graphs

Heterogeneous transfer learning has been proposed as a learning strategy that improves performance in a target domain by leveraging data from heterogeneous source domains whose feature spaces may differ from the target's. To connect two different feature spaces, a common technique is to bridge them with co-occurrence data. For example, annotated images can be used to build a feature mapping from words to image features, which is then applied to text-to-image knowledge transfer. In practice, however, such co-occurrence data often come from the Web, e.g., Flickr, and are generated by users, so they can be sparse and carry personal biases. Models built directly on such data may fail to provide a reliable bridge. To address these problems, in this paper we propose a novel algorithm named Mixed-Transfer. It is composed of three components: a cross-domain harmonic function that avoids personal biases; a joint transition probability graph over mixed instances and features that models the heterogeneous transfer learning problem; and a random walk process that simulates label propagation on the graph and mitigates the data sparsity problem. We conduct experiments on 171 real-world tasks, showing that the proposed approach outperforms four state-of-the-art heterogeneous transfer learning algorithms.
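To make the propagation idea concrete, the sketch below shows label propagation by a random walk with restart over a joint transition probability matrix defined on mixed nodes (instances and features). This is a minimal illustration, not the paper's actual formulation: the function name `propagate_labels`, the restart parameter, and the toy graph are assumptions introduced here for clarity.

```python
import numpy as np

def propagate_labels(P, Y0, restart_prob=0.15, n_steps=50, tol=1e-6):
    """Hypothetical sketch of random-walk label propagation on a joint
    transition probability graph over mixed instances and features.

    P  : (n, n) row-stochastic transition matrix over all nodes
         (source instances, target instances, and feature nodes).
    Y0 : (n, c) initial label scores; rows of labeled nodes carry
         one-hot class scores, unlabeled rows are zero.
    """
    Y = Y0.copy()
    for _ in range(n_steps):
        # One random-walk step mixes each node's scores with its
        # neighbours'; the restart term keeps the known labels
        # influential, which helps when the co-occurrence graph is sparse.
        Y_next = (1.0 - restart_prob) * P @ Y + restart_prob * Y0
        if np.abs(Y_next - Y).max() < tol:
            Y = Y_next
            break
        Y = Y_next
    return Y

# Toy usage: 4 mixed nodes (2 labeled instances, 1 unlabeled instance, 1 feature node)
P = np.array([[0.0, 0.5, 0.3, 0.2],
              [0.5, 0.0, 0.3, 0.2],
              [0.4, 0.4, 0.0, 0.2],
              [0.3, 0.3, 0.4, 0.0]])
Y0 = np.array([[1.0, 0.0],   # labeled: class 0
               [0.0, 1.0],   # labeled: class 1
               [0.0, 0.0],   # unlabeled target instance
               [0.0, 0.0]])  # feature node
print(propagate_labels(P, Y0))
```

After convergence, each unlabeled instance is assigned the class with the highest propagated score; the feature nodes act only as conduits that let label mass flow between the two feature spaces.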
