Low-Resource Domain Adaptation for Compositional Task-Oriented Semantic Parsing
Asish Ghoshal | Yashar Mehdad | Luke Zettlemoyer | Xilun Chen | Sonal Gupta