An Ensemble Deep Active Learning Method for Intent Classification
[1] Ruixi Lin,et al. Enhancing Chinese Intent Classification by Dynamically Integrating Character Features into Word Embeddings with Ensemble Techniques , 2018, ArXiv.
[2] Li Tang,et al. Attention-Based CNN-BLSTM Networks for Joint Intent Detection and Slot Filling , 2018, CCL.
[3] Katrin Kirchhoff,et al. Simple, Fast, Accurate Intent Classification and Slot Labeling for Goal-Oriented Dialogue Systems , 2019, SIGdial.
[4] Zachary C. Lipton,et al. Deep Bayesian Active Learning for Natural Language Processing: Results of a Large-Scale Empirical Study , 2018, EMNLP.
[5] Zoubin Ghahramani,et al. Deep Bayesian Active Learning with Image Data , 2017, ICML.
[6] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[7] Wanxiang Che,et al. Pre-Training with Whole Word Masking for Chinese BERT , 2019, ArXiv.
[8] Frédéric Precioso,et al. Adversarial Active Learning for Deep Networks: a Margin Based Approach , 2018, ArXiv.
[9] Bing Liu,et al. Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling , 2016, INTERSPEECH.
[10] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[11] Wen Wang,et al. BERT for Joint Intent Classification and Slot Filling , 2019, ArXiv.
[12] Dan Wang,et al. A new active labeling method for deep learning , 2014, 2014 International Joint Conference on Neural Networks (IJCNN).
[13] Weinan Zhang,et al. An Evaluation of Chinese Human-Computer Dialogue Technology , 2019, Data Intelligence.
[14] Silvio Savarese,et al. Active Learning for Convolutional Neural Networks: A Core-Set Approach , 2017, ICLR.