Wonyong Jeong | Jaehong Yoon | Eunho Yang | Sung Ju Hwang
[1] Shin Ishii, et al. Virtual Adversarial Training: A Regularization Method for Supervised and Semi-Supervised Learning, 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[2] Anit Kumar Sahu, et al. Federated Optimization in Heterogeneous Networks, 2018, MLSys.
[3] David Berthelot, et al. MixMatch: A Holistic Approach to Semi-Supervised Learning, 2019, NeurIPS.
[4] Geoffrey French, et al. Self-ensembling for Visual Domain Adaptation, 2017, ICLR.
[5] David Berthelot, et al. ReMixMatch: Semi-Supervised Learning with Distribution Alignment and Augmentation Anchoring, 2019, ArXiv.
[6] Xiaoyan Sun, et al. Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation, 2019, IEEE Transactions on Neural Networks and Learning Systems.
[7] Tapani Raiko, et al. Semi-supervised Learning with Ladder Networks, 2015, NIPS.
[8] Tolga Tasdizen, et al. Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning, 2016, NIPS.
[9] Quoc V. Le, et al. Unsupervised Data Augmentation for Consistency Training, 2019, NeurIPS.
[10] Huzefa Rangwala, et al. Asynchronous Online Federated Learning for Edge Devices with Non-IID Data, 2020, IEEE International Conference on Big Data.
[11] Alexander Zien, et al. Label Propagation and Quadratic Criterion, 2006.
[12] Alexander Gammerman, et al. Learning by Transduction, 1998, UAI.
[13] David Berthelot, et al. FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence, 2020, NeurIPS.
[14] Javier R. Movellan, et al. Whose Vote Should Count More: Optimal Integration of Labels from Labelers of Unknown Expertise, 2009, NIPS.
[15] Quoc V. Le, et al. RandAugment: Practical Automated Data Augmentation with a Reduced Search Space, 2020, IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW).
[16] Dong-Hyun Lee. Pseudo-Label: The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks, 2013, ICML Workshop on Challenges in Representation Learning.
[17] Geoffrey E. Hinton, et al. Using Deep Belief Nets to Learn Covariance Kernels for Gaussian Processes, 2007, NIPS.
[18] Blaise Agüera y Arcas, et al. Communication-Efficient Learning of Deep Networks from Decentralized Data, 2016, AISTATS.
[19] Yasaman Khazaeni, et al. Federated Learning with Matched Averaging, 2020, ICLR.
[20] Yue Zhao, et al. Federated Learning with Non-IID Data, 2018, ArXiv.
[21] Ameet Talwalkar, et al. One-Shot Federated Learning, 2019, ArXiv.
[22] Yoshua Bengio, et al. Semi-supervised Learning by Entropy Minimization, 2004, CAP.
[23] Yasaman Khazaeni, et al. Bayesian Nonparametric Federated Learning of Neural Networks, 2019, ICML.
[24] Abdullatif Albaseer, et al. Exploiting Unlabeled Data in Smart Cities Using Federated Learning, 2020, ArXiv.
[25] Alexandros Karatzoglou, et al. Overcoming Catastrophic Forgetting with Hard Attention to the Task, 2018, ICML.
[26] Nitish Srivastava, et al. Dropout: A Simple Way to Prevent Neural Networks from Overfitting, 2014, Journal of Machine Learning Research.