[1] Hongyi Zhang, et al. mixup: Beyond Empirical Risk Minimization, 2017, ICLR.
[2] Maya R. Gupta, et al. Deep k-NN for Noisy Labels, 2020, ICML.
[3] Cordelia Schmid, et al. What makes for good views for contrastive learning, 2020, NeurIPS.
[4] Takashi Matsubara, et al. RICAP: Random Image Cropping and Patching Data Augmentation for Deep CNNs, 2018, ACML.
[5] Samy Bengio, et al. Understanding deep learning requires rethinking generalization, 2016, ICLR.
[6] Thomas Brox, et al. SELF: Learning to Filter Noisy Labels with Self-Ensembling, 2019, ICLR.
[7] Shaogang Gong, et al. Unsupervised Deep Learning by Neighbourhood Discovery, 2019, ICML.
[8] Wei Li, et al. WebVision Database: Visual Learning and Understanding from Web Data, 2017, ArXiv.
[9] Sheng Liu, et al. Early-Learning Regularization Prevents Memorization of Noisy Labels, 2020, NeurIPS.
[10] Ce Liu, et al. Supervised Contrastive Learning, 2020, NeurIPS.
[11] Jae-Gil Lee, et al. Prestopping: How Does Early Stopping Help Generalization against Label Noise?, 2019, ArXiv.
[13] Li Fei-Fei, et al. MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels, 2017, ICML.
[14] Vincent Gripon, et al. Leveraging the Feature Distribution in Transfer-based Few-Shot Learning, 2020, ICANN.
[15] Geoffrey E. Hinton, et al. A Simple Framework for Contrastive Learning of Visual Representations, 2020, ICML.
[16] Xingrui Yu, et al. How does Disagreement Help Generalization against Label Corruption?, 2019, ICML.
[17] Jae-Gil Lee, et al. SELFIE: Refurbishing Unclean Samples for Robust Deep Learning, 2019, ICML.
[18] Kiyoharu Aizawa, et al. Joint Optimization Framework for Learning with Noisy Labels, 2018, CVPR.
[19] Gang Niu, et al. Are Anchor Points Really Indispensable in Label-Noise Learning?, 2019, NeurIPS.
[20] Nir Ailon, et al. Deep Metric Learning Using Triplet Network, 2014, SIMBAD.
[21] Kimin Lee, et al. Using Pre-Training Can Improve Model Robustness and Uncertainty, 2019, ICML.
[22] Li Fei-Fei, et al. ImageNet: A large-scale hierarchical image database, 2009, CVPR.
[23] Weilin Huang, et al. Cross-Batch Memory for Embedding Learning, 2020, CVPR.
[24] Deliang Fan, et al. A Semi-Supervised Two-Stage Approach to Learning from Noisy Labels, 2018, WACV.
[25] Kaiming He, et al. Momentum Contrast for Unsupervised Visual Representation Learning, 2019, CVPR.
[26] Honglak Lee, et al. Distilling Effective Supervision From Severe Label Noise, 2019, CVPR.
[27] Seong Joon Oh, et al. CutMix: Regularization Strategy to Train Strong Classifiers With Localizable Features, 2019, ICCV.
[28] Alan F. Smeaton, et al. Contrastive Representation Learning: A Framework and Review, 2020, IEEE Access.
[29] Yann LeCun, et al. Learning a similarity metric discriminatively, with application to face verification, 2005, CVPR.
[30] Bo An, et al. Combating Noisy Labels by Agreement: A Joint Training Method with Co-Regularization, 2020, CVPR.
[31] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, CVPR.
[32] Aymeric Histace, et al. Metric Learning With HORDE: High-Order Regularizer for Deep Embeddings, 2019, ICCV.
[33] Kun Yi, et al. Probabilistic End-To-End Noise Correction for Learning With Noisy Labels, 2019, CVPR.
[34] Kevin Gimpel, et al. Using Trusted Data to Train Deep Networks on Labels Corrupted by Severe Noise, 2018, NeurIPS.
[35] Kaiming He, et al. Improved Baselines with Momentum Contrastive Learning, 2020, ArXiv.
[36] James Bailey, et al. Normalized Loss Functions for Deep Learning with Noisy Labels, 2020, ICML.
[37] Noel E. O'Connor, et al. Unsupervised label noise modeling and loss correction, 2019, ICML.
[38] Jeff A. Bilmes, et al. Combating Label Noise in Deep Learning Using Abstention, 2019, ICML.
[39] Xingrui Yu, et al. Co-teaching: Robust training of deep neural networks with extremely noisy labels, 2018, NeurIPS.
[40] Jian Sun, et al. Identity Mappings in Deep Residual Networks, 2016, ECCV.
[41] Mert R. Sabuncu, et al. Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels, 2018, NeurIPS.
[42] Junmo Kim, et al. NLNL: Negative Learning for Noisy Labels, 2019, ICCV.
[43] Weilong Yang, et al. Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels, 2019, ICML.
[44] Ser-Nam Lim, et al. A Metric Learning Reality Check, 2020, ECCV.
[45] Le Song, et al. Iterative Learning with Open-set Noisy Labels, 2018, CVPR.
[46] Noel E. O'Connor, et al. Towards Robust Learning with Different Label Noise Distributions, 2019, ArXiv.
[47] Aram Galstyan, et al. Improving Generalization by Controlling Label-Noise Information in Neural Network Weights, 2020, ICML.
[48] Noel E. O'Connor, et al. Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning, 2019, IJCNN.
[49] Richard Nock, et al. Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach, 2016, CVPR.
[50] Xing Ji, et al. CosFace: Large Margin Cosine Loss for Deep Face Recognition, 2018, CVPR.
[51] Dumitru Erhan, et al. Training Deep Neural Networks on Noisy Labels with Bootstrapping, 2014, ICLR.
[52] Xiaogang Wang, et al. Deep Self-Learning From Noisy Labels, 2019, ICCV.
[53] Jun Sun, et al. Safeguarded Dynamic Label Regression for Noisy Supervision, 2019, AAAI.
[54] Alex Krizhevsky, et al. Learning Multiple Layers of Features from Tiny Images, 2009.
[55] Junnan Li, et al. DivideMix: Learning with Noisy Labels as Semi-supervised Learning, 2020, ICLR.
[56] Weilin Huang, et al. CurriculumNet: Weakly Supervised Learning from Large-Scale Web Images, 2018, ECCV.
[58] Qi Qian, et al. SoftTriple Loss: Deep Metric Learning Without Triplet Sampling, 2019, ICCV.
[59] Yoshua Bengio, et al. A Closer Look at Memorization in Deep Networks, 2017, ICML.