Shaojie Jiang | Christof Monz | Maarten de Rijke | Thomas Wolf
[1] Bowen Zhou,et al. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond , 2016, CoNLL.
[2] Jason Weston,et al. Don't Say That! Making Inconsistent Dialogue Unlikely with Unlikelihood Training , 2020, ACL.
[3] Jason Weston,et al. ELI5: Long Form Question Answering , 2019, ACL.
[4] Jimmy Ba,et al. Adam: A Method for Stochastic Optimization , 2014, ICLR.
[5] Nitish Srivastava,et al. Dropout: a simple way to prevent neural networks from overfitting , 2014, J. Mach. Learn. Res..
[6] Yang Liu,et al. Context Gates for Neural Machine Translation , 2016, TACL.
[7] Christopher D. Manning,et al. Effective Approaches to Attention-based Neural Machine Translation , 2015, EMNLP.
[8] Richard Socher,et al. A Deep Reinforced Model for Abstractive Summarization , 2017, ICLR.
[9] Kaiming He,et al. Focal Loss for Dense Object Detection , 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[10] Jason Weston,et al. Neural Text Generation with Unlikelihood Training , 2019, ICLR.
[11] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[12] Jürgen Schmidhuber,et al. Training Very Deep Networks , 2015, NIPS.
[13] Cristian Danescu-Niculescu-Mizil,et al. Chameleons in Imagined Conversations: A New Approach to Understanding Coordination of Linguistic Style in Dialogs , 2011, CMCL@ACL.
[14] Rita Cucchiara,et al. Paying More Attention to Saliency , 2017, ACM Trans. Multim. Comput. Commun. Appl..
[15] Jianfeng Gao,et al. A Diversity-Promoting Objective Function for Neural Conversation Models , 2015, NAACL.
[16] Jason Weston,et al. ParlAI: A Dialog Research Software Platform , 2017, EMNLP.
[17] M. de Rijke,et al. Finding Influential Training Samples for Gradient Boosted Decision Trees , 2018, ICML.
[18] Yann Dauphin,et al. Hierarchical Neural Story Generation , 2018, ACL.
[19] James R. Glass,et al. Negative Training for Neural Dialogue Response Generation , 2019, ACL.
[20] Salim Roukos,et al. Bleu: a Method for Automatic Evaluation of Machine Translation , 2002, ACL.
[21] Jürgen Schmidhuber,et al. Long Short-Term Memory , 1997, Neural Computation.
[22] Masaaki Nagata,et al. Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization , 2016, EACL.
[23] Joelle Pineau,et al. The Ubuntu Dialogue Corpus: A Large Dataset for Research in Unstructured Multi-Turn Dialogue Systems , 2015, SIGDIAL Conference.
[24] Quoc V. Le,et al. Sequence to Sequence Learning with Neural Networks , 2014, NIPS.
[25] Steven Bird,et al. NLTK: The Natural Language Toolkit , 2002, ACL.
[26] Yejin Choi,et al. The Curious Case of Neural Text Degeneration , 2019, ICLR.
[27] Jianfeng Gao,et al. Deep Reinforcement Learning for Dialogue Generation , 2016, EMNLP.
[28] Lav R. Varshney,et al. CTRL: A Conditional Transformer Language Model for Controllable Generation , 2019, ArXiv.
[29] Razvan Pascanu,et al. On the difficulty of training recurrent neural networks , 2012, ICML.
[30] Colin Raffel,et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer , 2019, J. Mach. Learn. Res..
[31] Zhiguo Wang,et al. Coverage Embedding Models for Neural Machine Translation , 2016, EMNLP.
[32] Yang Liu,et al. Modeling Coverage for Neural Machine Translation , 2016, ACL.
[34] M. de Rijke,et al. Improving Neural Response Diversity with Frequency-Aware Cross-Entropy Loss , 2019, WWW.
[35] Xiaoyu Shen,et al. DailyDialog: A Manually Labelled Multi-turn Dialogue Dataset , 2017, IJCNLP.
[36] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[37] Ross B. Girshick,et al. Focal Loss for Dense Object Detection , 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.