Noise label learning through label confidence statistical inference
Min Wang | Hong-Tian Yu | Fan Min | Hong Yu
[1] D. Steinberg. CART: Classification and Regression Trees, 2009.
[2] Li Fei-Fei, et al. MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels, 2017, ICML.
[3] Masashi Sugiyama, et al. On Symmetric Losses for Learning from Corrupted Labels, 2019, ICML.
[4] Richard Nock, et al. Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach, 2017, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[5] Xiaojin Zhu. Semi-Supervised Learning Literature Survey, 2006, University of Wisconsin-Madison Technical Report.
[6] M. Verleysen, et al. Classification in the Presence of Label Noise: A Survey, 2014, IEEE Transactions on Neural Networks and Learning Systems.
[7] J. Ross Quinlan. Induction of Decision Trees, 1986, Machine Learning.
[8] J. Friedman. Additive Logistic Regression: A Statistical View of Boosting, 2000, The Annals of Statistics.
[9] Naresh Manwani, et al. Noise Tolerance Under Risk Minimization, 2011, IEEE Transactions on Cybernetics.
[10] Arash Vahdat, et al. Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks, 2017, NIPS.
[11] Yi Ding, et al. Augmentation Strategies for Learning with Noisy Labels, 2021, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[12] Abdulrahman H. Altalhi, et al. Statistical Comparisons of Active Learning Strategies over Multiple Datasets, 2018, Knowledge-Based Systems.
[13] Bin Yang, et al. Learning to Reweight Examples for Robust Deep Learning, 2018, ICML.
[14] Junnan Li, et al. DivideMix: Learning with Noisy Labels as Semi-supervised Learning, 2020, ICLR.
[15] D. Angluin, et al. Learning from Noisy Examples, 1988, Machine Learning.
[16] Shin Ishii, et al. Virtual Adversarial Training: A Regularization Method for Supervised and Semi-Supervised Learning, 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[17] Aritra Ghosh, et al. Making Risk Minimization Tolerant to Label Noise, 2014, Neurocomputing.
[18] Gang Niu, et al. SemiNLL: A Framework of Noisy-Label Learning by Semi-Supervised Learning, 2020, Transactions on Machine Learning Research.
[19] Xiaogang Wang, et al. Learning from Massive Noisy Labeled Data for Image Classification, 2015, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[20] Nagarajan Natarajan, et al. Learning with Noisy Labels, 2013, NIPS.
[21] Hailin Shi, et al. Co-Mining: Deep Face Recognition with Noisy Labels, 2019, IEEE/CVF International Conference on Computer Vision (ICCV).
[22] Xingrui Yu, et al. Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels, 2018, NeurIPS.
[23] Li Fei-Fei, et al. MentorNet: Regularizing Very Deep Neural Networks on Corrupted Labels, 2017, arXiv.
[24] Xingrui Yu, et al. How Does Disagreement Help Generalization against Label Corruption?, 2019, ICML.
[25] Nitish Srivastava, et al. Dropout: A Simple Way to Prevent Neural Networks from Overfitting, 2014, Journal of Machine Learning Research.
[26] Hongyi Zhang, et al. mixup: Beyond Empirical Risk Minimization, 2017, ICLR.
[27] Fabricio A. Breve, et al. Particle Competition and Cooperation for Semi-Supervised Learning with Label Noise, 2015, Neurocomputing.
[28] Dacheng Tao, et al. Classification with Noisy Labels by Importance Reweighting, 2014, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[29] Francisco Herrera, et al. CNC-NOS: Class Noise Cleaning by Ensemble Filtering and Noise Scoring, 2018, Knowledge-Based Systems.
[30] Geoffrey E. Hinton, et al. Regularizing Neural Networks by Penalizing Confident Output Distributions, 2017, ICLR.
[31] Aritra Ghosh, et al. Robust Loss Functions under Label Noise for Deep Neural Networks, 2017, AAAI.
[32] Michael I. Jordan, et al. Convexity, Classification, and Risk Bounds, 2006, Journal of the American Statistical Association.
[33] Matthew S. Nokleby, et al. Learning Deep Networks from Noisy Labels with Dropout Regularization, 2016, IEEE 16th International Conference on Data Mining (ICDM).
[34] Junmo Kim, et al. NLNL: Negative Learning for Noisy Labels, 2019, IEEE/CVF International Conference on Computer Vision (ICCV).
[35] Weilong Yang, et al. Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels, 2019, ICML.
[36] Samy Bengio, et al. Understanding Deep Learning Requires Rethinking Generalization, 2016, ICLR.
[37] Wei Li, et al. WebVision Database: Visual Learning and Understanding from Web Data, 2017, arXiv.
[38] Jonathon Shlens, et al. Explaining and Harnessing Adversarial Examples, 2014, ICLR.
[39] Yi Yang, et al. A Multimedia Retrieval Framework Based on Semi-Supervised Ranking and Relevance Feedback, 2012, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[40] Lei Zhang, et al. CleanNet: Transfer Learning for Scalable Image Classifier Training with Label Noise, 2018, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[41] Mert R. Sabuncu, et al. Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels, 2018, NeurIPS.
[42] Zhi-Hua Zhou. A Brief Introduction to Weakly Supervised Learning, 2018, National Science Review.
[43] Qinghua Hu, et al. Training Noise-Robust Deep Neural Networks via Meta-Learning, 2020, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[44] Xiaofei Zhang, et al. Density Peak-Based Noisy Label Detection for Hyperspectral Image Classification, 2019, IEEE Transactions on Geoscience and Remote Sensing.
[45] Susan T. Dumais, et al. Meta Label Correction for Noisy Label Learning, 2019, AAAI.
[46] Anima Anandkumar, et al. Learning from Noisy Singly-labeled Data, 2017, ICLR.