暂无分享,去创建一个
[1] Edward R. Tufte,et al. Envisioning Information , 1990 .
[2] Jason Weston,et al. A Neural Attention Model for Abstractive Sentence Summarization , 2015, EMNLP.
[3] Tao Li,et al. Visual Interrogation of Attention-Based Models for Natural Language Inference and Machine Comprehension , 2018, EMNLP.
[4] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[5] Jun-Seok Kim,et al. Interactive Visualization and Manipulation of Attention-based Neural Machine Translation , 2017, EMNLP.
[6] Samy Bengio,et al. Tensor2Tensor for Neural Machine Translation , 2018, AMTA.
[7] Byron C. Wallace,et al. Attention is not Explanation , 2019, NAACL.
[8] Yonatan Belinkov,et al. Analysis Methods in Neural Language Processing: A Survey , 2018, TACL.
[9] Anupam Datta,et al. Gender Bias in Neural Natural Language Processing , 2018, Logic, Language, and Security.
[10] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[11] Phil Blunsom,et al. Reasoning about Entailment with Neural Attention , 2015, ICLR.
[12] Alexander M. Rush,et al. Seq2seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models , 2018, IEEE Transactions on Visualization and Computer Graphics.
[13] Ilya Sutskever,et al. Language Models are Unsupervised Multitask Learners , 2019 .
[14] Yonatan Belinkov,et al. Identifying and Controlling Important Neurons in Neural Machine Translation , 2018, ICLR.
[15] Jieyu Zhao,et al. Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods , 2018, NAACL.
[16] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.