Yaser Al-Onaizan | Marcello Federico | Georgiana Dinu | Stanislas Lauly | Prashant Mathur
[1] Michael Hahn, et al. Theoretical Limitations of Self-Attention in Neural Sequence Models, 2019, TACL.
[2] William Merrill, et al. Sequential Neural Networks as Automata, 2019, Proceedings of the Workshop on Deep Learning and Formal Languages: Building Bridges.
[3] Guillaume Lample, et al. Deep Learning for Symbolic Mathematics, 2019, ICLR.
[4] Adam Tauman Kalai, et al. Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings, 2016, NIPS.
[5] Hannaneh Hajishirzi, et al. MAWPS: A Math Word Problem Repository, 2016, NAACL.
[6] Dietrich Klakow, et al. Closing Brackets with Recurrent Neural Networks, 2018, BlackboxNLP@EMNLP.
[7] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[8] Patrick Pantel, et al. From Frequency to Meaning: Vector Space Models of Semantics, 2010, J. Artif. Intell. Res.
[9] Gabriel Stanovsky, et al. DROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs, 2019, NAACL.
[10] Philipp Koehn, et al. Europarl: A Parallel Corpus for Statistical Machine Translation, 2005, MT Summit.
[11] Yonatan Belinkov, et al. On Evaluating the Generalization of LSTM Models in Formal Languages, 2018, ArXiv.
[12] Edouard Grave, et al. Colorless Green Recurrent Networks Dream Hierarchically, 2018, NAACL.
[13] Chris Dyer, et al. Neural Arithmetic Logic Units, 2018, NeurIPS.
[14] Carolyn Penstein Rosé, et al. EQUATE: A Benchmark Evaluation Framework for Quantitative Reasoning in Natural Language Inference, 2019, CoNLL.
[15] Alessandro Lenci, et al. Distributional Memory: A General Framework for Corpus-Based Semantics, 2010, CL.
[16] Sameer Singh, et al. Do NLP Models Know Numbers? Probing Numeracy in Embeddings, 2019, EMNLP.
[17] Rico Sennrich, et al. Linguistic Input Features Improve Neural Machine Translation, 2016, WMT.
[18] Tomas Mikolov, et al. Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets, 2015, NIPS.
[19] Xipeng Qiu, et al. Neural Arithmetic Expression Calculator, 2018, ArXiv.
[20] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[21] Marta R. Costa-jussà, et al. Findings of the 2019 Conference on Machine Translation (WMT19), 2019, WMT.
[22] Jean-Philippe Bernardy, et al. Can Recurrent Neural Networks Learn Nested Recursion?, 2018, LiLT.
[23] Eran Yahav, et al. On the Practical Computational Power of Finite Precision RNNs for Language Recognition, 2018, ACL.
[24] Shuming Shi, et al. Deep Neural Solver for Math Word Problems, 2017, EMNLP.
[25] Andy Way, et al. Getting Gender Right in Neural Machine Translation, 2019, EMNLP.
[26] Matt Post, et al. Sockeye: A Toolkit for Neural Machine Translation, 2018, ArXiv.
[27] Pushmeet Kohli, et al. Analysing Mathematical Reasoning Abilities of Neural Models, 2019, ICLR.
[28] Lucia Specia, et al. The IWSLT 2019 Evaluation Campaign, 2019, IWSLT.
[29] Robert C. Berwick, et al. Evaluating the Ability of LSTMs to Learn Context-Free Grammars, 2018, BlackboxNLP@EMNLP.
[30] Chitta Baral, et al. Learning To Use Formulas To Solve Simple Arithmetic Problems, 2016, ACL.
[31] Emmanuel Dupoux, et al. Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies, 2016, TACL.
[32] Jörg Tiedemann, et al. Parallel Data, Tools and Interfaces in OPUS, 2012, LREC.
[33] Andrew McCallum, et al. Energy and Policy Considerations for Deep Learning in NLP, 2019, ACL.