Clarinet: A One-step Approach Towards Budget-friendly Unsupervised Domain Adaptation

In unsupervised domain adaptation (UDA), a classifier for the target domain is trained with massive true-label data from the source domain and unlabeled data from the target domain. However, collecting fully-true-label data in the source domain may be difficult given a limited budget. To mitigate this problem, we consider a novel problem setting, budget-friendly UDA (BFUDA), in which the classifier for the target domain is trained with complementary-label data from the source domain and unlabeled data from the target domain. The key benefit is that collecting complementary-label source data (required by BFUDA) is much less costly than collecting true-label source data (required by ordinary UDA). To solve the BFUDA problem, we propose the complementary label adversarial network (CLARINET). CLARINET maintains two deep networks simultaneously: one focuses on classifying the complementary-label source data, while the other handles the source-to-target distributional adaptation. Experiments show that CLARINET significantly outperforms a series of competitive baselines.
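The abstract describes the two-network design only at a high level; below is a minimal sketch of how such a one-step setup could look, not the authors' implementation. It assumes a uniform complementary-label distribution, substitutes a simple negative-learning surrogate loss, -log(1 - p_ȳ), for whatever complementary-label risk CLARINET actually optimizes, and stands in a plain DANN-style gradient-reversal discriminator for its adaptation scheme; all module and function names (Net, complementary_loss, train_step) are illustrative.

```python
# Hypothetical sketch of a one-step complementary-label + adversarial
# adaptation setup. NOT the CLARINET reference implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) the gradient backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

class Net(nn.Module):
    """One shared feature extractor feeding two heads: a classifier trained
    on complementary-label source data, and a source-vs-target discriminator."""
    def __init__(self, in_dim=784, feat_dim=128, n_classes=10):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, n_classes)
        self.discriminator = nn.Linear(feat_dim, 2)

    def forward(self, x, lam=1.0):
        f = self.feature(x)
        return self.classifier(f), self.discriminator(grad_reverse(f, lam))

def complementary_loss(logits, comp_labels):
    # Negative-learning surrogate: push down the probability of the
    # complementary (known-wrong) class, L = -log(1 - p_{ȳ}).
    p = F.softmax(logits, dim=1)
    p_bar = p.gather(1, comp_labels.view(-1, 1)).squeeze(1)
    return -torch.log(1.0 - p_bar + 1e-8).mean()

def train_step(net, opt, xs, ys_bar, xt, lam=1.0):
    """One joint update: (xs, ys_bar) is a complementary-label source batch,
    xt is an unlabeled target batch."""
    opt.zero_grad()
    logits_s, dom_s = net(xs, lam)
    _, dom_t = net(xt, lam)
    cls_loss = complementary_loss(logits_s, ys_bar)
    dom_labels = torch.cat([torch.zeros(len(xs)), torch.ones(len(xt))]).long()
    dom_loss = F.cross_entropy(torch.cat([dom_s, dom_t]), dom_labels)
    (cls_loss + dom_loss).backward()
    opt.step()
    return cls_loss.item(), dom_loss.item()
```

A single optimizer can update both heads because the gradient-reversal layer already flips the discriminator's gradient with respect to the shared features, which is what makes the training one-step rather than alternating. An unbiased complementary-label risk estimator (in the spirit of Ishida et al.'s "Learning from Complementary Labels") could be dropped in for complementary_loss without changing the training loop.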
