Hermann Ney | Christian Dugast | David Thulke | Nico Daheim
[1] Omer Levy, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, 2019, ACL.
[2] Stefan Ultes, et al. MultiWOZ - A Large-Scale Multi-Domain Wizard-of-Oz Dataset for Task-Oriented Dialogue Modelling, 2018, EMNLP.
[3] Kilian Q. Weinberger, et al. Distance Metric Learning for Large Margin Nearest Neighbor Classification, 2005, NIPS.
[4] Iryna Gurevych, et al. Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks, 2019, EMNLP.
[5] Byeongchang Kim, et al. Sequential Latent Knowledge Selection for Knowledge-Grounded Dialogue, 2020, ICLR.
[6] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[7] Jason Weston, et al. Wizard of Wikipedia: Knowledge-Powered Conversational Agents, 2018, ICLR.
[8] Dilek Z. Hakkani-Tür, et al. Beyond Domain APIs: Task-oriented Conversational Modeling with Unstructured Knowledge Access, 2020, SIGDIAL.
[9] Mitesh M. Khapra, et al. Towards Exploiting Background Knowledge for Building Conversation Systems, 2018, EMNLP.
[10] Thomas Wolf, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing, 2019, arXiv.
[11] Fabio Petroni, et al. Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks, 2020, NeurIPS.
[12] Mihail Eric, et al. MultiWOZ 2.1: A Consolidated Multi-Domain Dialogue Dataset with State Corrections and State Tracking Baselines, 2019, arXiv.
[13] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019, J. Mach. Learn. Res.
[14] Hermann Ney, et al. Sisyphus, a Workflow Manager Designed for Machine Translation and Automatic Speech Recognition, 2018, EMNLP.
[15] Wei Wu, et al. Knowledge-Grounded Dialogue Generation with Pre-trained Language Models, 2020, EMNLP.
[16] Jeff Johnson, et al. Billion-Scale Similarity Search with GPUs, 2017, IEEE Transactions on Big Data.
[17] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, arXiv.
[18] Kee-Eung Kim, et al. End-to-End Neural Pipeline for Goal-Oriented Dialogue Systems using GPT-2, 2020, ACL.
[19] Danqi Chen, et al. Dense Passage Retrieval for Open-Domain Question Answering, 2020, EMNLP.
[20] Ming-Wei Chang, et al. REALM: Retrieval-Augmented Language Model Pre-Training, 2020, ICML.
[21] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[22] Ming-Wei Chang, et al. A Knowledge-Grounded Neural Conversation Model, 2017, AAAI.
[23] Sebastian Ruder, et al. An Overview of Multi-Task Learning in Deep Neural Networks, 2017, arXiv.
[24] Xiaodong Liu, et al. Multi-Task Deep Neural Networks for Natural Language Understanding, 2019, ACL.
[25] Lav R. Varshney, et al. CTRL: A Conditional Transformer Language Model for Controllable Generation, 2019, arXiv.
[26] Yejin Choi, et al. The Curious Case of Neural Text Degeneration, 2019, ICLR.
[27] Ivan Vulić, et al. Hello, It's GPT-2 - How Can I Help You? Towards the Use of Pretrained Language Models for Task-Oriented Dialogue Systems, 2019, EMNLP.