Sampling Bias in Deep Active Classification: An Empirical Study
[1] Tong Zhang,et al. Deep Pyramid Convolutional Neural Networks for Text Categorization , 2017, ACL.
[2] Baoxin Wang,et al. Disconnected Recurrent Neural Networks for Text Categorization , 2018, ACL.
[3] Yann LeCun,et al. Very Deep Convolutional Networks for Text Classification , 2016, EACL.
[4] Jason Baldridge,et al. Active Learning and the Total Cost of Annotation , 2004, EMNLP.
[5] Nathan Srebro,et al. The Implicit Bias of Gradient Descent on Separable Data , 2017, J. Mach. Learn. Res..
[6] Bo Huang,et al. A New Method of Region Embedding for Text Classification , 2018, ICLR.
[7] Raymond J. Mooney,et al. Active Learning for Natural Language Parsing and Information Extraction , 1999, ICML.
[8] Andrew McCallum,et al. Employing EM and Pool-Based Active Learning for Text Classification , 1998, ICML.
[9] Tong Zhang,et al. The Value of Unlabeled Data for Classification Problems , 2000, ICML.
[10] Sebastian Ruder,et al. Universal Language Model Fine-tuning for Text Classification , 2018, ACL.
[11] Silvio Savarese,et al. Active Learning for Convolutional Neural Networks: A Core-Set Approach , 2017, ICLR.
[12] Prasad Tadepalli,et al. Active Learning with Committees for Text Categorization , 1997, AAAI/IAAI.
[13] Ye Zhang,et al. Active Discriminative Text Representation Learning , 2016, AAAI.
[14] Klaus Brinker,et al. Incorporating Diversity in Active Learning with Support Vector Machines , 2003, ICML.
[15] Frédéric Precioso,et al. Adversarial Active Learning for Deep Networks: a Margin Based Approach , 2018, ArXiv.
[16] Yi Zhou,et al. When Will Gradient Methods Converge to Max-margin Classifier under ReLU Models? , 2018.
[17] Tianshun Yao,et al. Active Learning with Sampling by Uncertainty and Density for Word Sense Disambiguation and Text Classification , 2008, COLING.
[18] Burr Settles,et al. Active Learning Literature Survey , 2009.
[19] Byron C. Wallace,et al. How transferable are the datasets collected by active learners? , 2018, ArXiv.
[20] Zachary C. Lipton,et al. Deep Bayesian Active Learning for Natural Language Processing: Results of a Large-Scale Empirical Study , 2018, EMNLP.
[21] Andreas Nürnberger,et al. The Power of Ensembles for Active Learning in Image Classification , 2018, CVPR.
[22] Jitendra Malik,et al. Are All Training Examples Created Equal? An Empirical Study , 2018, ArXiv.
[23] Xiang Zhang,et al. Character-level Convolutional Networks for Text Classification , 2015, NIPS.
[24] Rong Jin,et al. Large-scale text categorization by batch mode active learning , 2006, WWW '06.
[25] Bingsheng He,et al. ThunderSVM: A Fast SVM Library on GPUs and CPUs , 2018, J. Mach. Learn. Res..
[26] Andrew McCallum,et al. Toward Optimal Active Learning through Sampling Estimation of Error Reduction , 2001, ICML.
[27] Yi Zhou,et al. Convergence of SGD in Learning ReLU Models with Separable Data , 2018, ArXiv.
[28] Anders Krogh,et al. Neural Network Ensembles, Cross Validation, and Active Learning , 1994, NIPS.
[29] Shai Shalev-Shwartz,et al. Discriminative Active Learning , 2019, ArXiv.
[30] Min Tang,et al. Active Learning for Statistical Natural Language Parsing , 2002, ACL.
[31] Yoshua Bengio,et al. An Empirical Study of Example Forgetting during Deep Neural Network Learning , 2018, ICLR.
[32] Gaël Varoquaux,et al. Scikit-learn: Machine Learning in Python , 2011, J. Mach. Learn. Res..
[33] Christopher D. Manning,et al. Baselines and Bigrams: Simple, Good Sentiment and Topic Classification , 2012, ACL.
[34] Zoubin Ghahramani,et al. Deep Bayesian Active Learning with Image Data , 2017, ICML.
[35] Kilian Q. Weinberger,et al. On Calibration of Modern Neural Networks , 2017, ICML.
[36] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[37] Bernt Schiele,et al. RALF: A reinforced active learning formulation for object class recognition , 2012, CVPR.
[38] Mark Craven,et al. An Analysis of Active Learning Strategies for Sequence Labeling Tasks , 2008, EMNLP.
[39] Charles Blundell,et al. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles , 2016, NIPS.
[40] Matthijs Douze,et al. FastText.zip: Compressing text classification models , 2016, ArXiv.
[41] Tomas Mikolov,et al. Bag of Tricks for Efficient Text Classification , 2016, EACL.
[42] Tian Gan,et al. Explicit Interaction Model towards Text Classification , 2018, AAAI.
[43] Daphne Koller,et al. Support Vector Machine Active Learning with Applications to Text Classification , 2000, J. Mach. Learn. Res..
[44] Kentaro Inui,et al. Selective Sampling for Example-based Word Sense Disambiguation , 1998, CL.
[45] William A. Gale,et al. A sequential algorithm for training text classifiers , 1994, SIGIR '94.
[46] Hossein Mobahi,et al. Semantic Redundancies in Image-Classification Datasets: The 10% You Don't Need , 2019, ArXiv.
[47] Zhidong Deng,et al. Densely Connected CNN with Multi-scale Feature Attention for Text Classification , 2018, IJCAI.