Xuan Li | Kaijie Zhou | Jianping Shen | Dawei Zhu | Zengfeng Zeng | Yue Ma | Yiying Yang | Xiaoyuan Yao