Cross Domain Shared Subspace Learning for Unsupervised Transfer Classification

Transfer learning addresses the problem of lacking labeled training data in one domain by leveraging sufficient training data from other, related domains. The problem becomes even more challenging when no labeled data are available in the target domain to build the association between the two domains, which is common in real-world scenarios. In this paper, we tackle this challenge by learning a shared subspace across domains. The subspace captures the intrinsic, domain-invariant characteristics of the feature representations. During the learning procedure, we simultaneously train classifiers in the source domain and predict labels in the target domain. We also incorporate the inherent data structure into the predicted labels to improve robustness against misclassification. Extensive experimental evaluations on public datasets demonstrate the effectiveness and promise of our method compared with state-of-the-art transfer learning methods.
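To make the described pipeline concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm. It substitutes off-the-shelf components for each step: a PCA projection fitted on pooled source and target data stands in for the learned shared subspace, a logistic-regression classifier trained on the projected source data predicts target labels, and a k-nearest-neighbor graph smoothing step stands in for exploiting the inherent data structure of the predictions. The function name `transfer_classify` and parameters `dim` and `n_neighbors` are assumptions made for illustration.

```python
# Minimal sketch only -- NOT the paper's method. Off-the-shelf proxies:
#   1) PCA on pooled data      ~ shared cross-domain subspace
#   2) logistic regression     ~ source-domain classifier
#   3) k-NN graph smoothing    ~ structure-aware label refinement
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import kneighbors_graph

def transfer_classify(Xs, ys, Xt, dim=20, n_neighbors=5):
    # 1. Learn a shared low-dimensional subspace from both domains (proxy: PCA).
    pca = PCA(n_components=dim).fit(np.vstack([Xs, Xt]))
    Zs, Zt = pca.transform(Xs), pca.transform(Xt)

    # 2. Train a classifier on projected source data; score target samples.
    clf = LogisticRegression(max_iter=1000).fit(Zs, ys)
    proba = clf.predict_proba(Zt)

    # 3. Smooth target predictions over a k-NN graph so that nearby target
    #    points receive consistent labels (proxy for structure regularization).
    W = kneighbors_graph(Zt, n_neighbors=n_neighbors, mode="connectivity")
    W = 0.5 * (W + W.T)                                   # symmetrize
    deg = np.asarray(W.sum(axis=1)).ravel()
    P = W.multiply(1.0 / np.maximum(deg, 1e-12)[:, None])  # row-normalize
    smoothed = 0.5 * proba + 0.5 * (P @ proba)
    return clf.classes_[np.argmax(smoothed, axis=1)]
```

A typical usage would pass labeled source features `Xs, ys` (e.g., from Caltech-256 style image descriptors) and unlabeled target features `Xt`, and receive predicted target labels; the actual method replaces each proxy step with a jointly optimized subspace, classifier, and structure term.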
