Exploration of Noise Strategies in Semi-supervised Named Entity Classification
[1] Omer Levy, et al. Dependency-Based Word Embeddings, 2014, ACL.
[2] Hwee Tou Ng, et al. Towards Robust Linguistic Analysis using OntoNotes, 2013, CoNLL.
[3] Quoc V. Le, et al. Semi-Supervised Sequence Modeling with Cross-View Training, 2018, EMNLP.
[4] Mihai Surdeanu, et al. An Exploration of Three Lightly-supervised Representation Learning Approaches for Named Entity Classification, 2018, COLING.
[5] Shin Ishii, et al. Virtual Adversarial Training: A Regularization Method for Supervised and Semi-Supervised Learning, 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[6] Mihai Surdeanu, et al. Keep Your Bearings: Lightly-Supervised Information Extraction with Ladder Networks That Avoids Semantic Drift, 2018, NAACL.
[7] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[8] Tapani Raiko, et al. Semi-supervised Learning with Ladder Networks, 2015, NIPS.
[9] Christopher D. Manning, et al. Distributed Representations of Words to Guide Bootstrapped Entity Classifiers, 2015, NAACL.
[10] Xiaojin Zhu, et al. Semi-Supervised Learning Literature Survey, 2006.
[11] Erik F. Tjong Kim Sang, et al. Introduction to the CoNLL-2003 Shared Task: Language-Independent Named Entity Recognition, 2003, CoNLL.
[12] Jonathon Shlens, et al. Explaining and Harnessing Adversarial Examples, 2014, ICLR.
[13] Yonatan Belinkov, et al. Synthetic and Natural Noise Both Break Neural Machine Translation, 2017, ICLR.
[14] Harri Valpola, et al. Weight-averaged consistency targets improve semi-supervised deep learning results, 2017, ArXiv.