Rishabh K. Iyer | Feng Chen | Xujiang Zhao | Krishnateja Killamsetty