Model-Free Context-Aware Word Composition
Bo An | Xianpei Han | Le Sun
[1] Dimitri Kartsaklis, et al. Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning, 2015, EMNLP.
[2] Jeffrey Pennington, et al. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection, 2011, NIPS.
[3] Larry P. Heck, et al. Contextual LSTM (CLSTM) models for Large scale NLP tasks, 2016, ArXiv.
[4] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[5] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[6] Michael I. Jordan, et al. Latent Dirichlet Allocation, 2003, J. Mach. Learn. Res.
[7] Steven Schockaert, et al. Jointly Learning Word Embeddings and Latent Topics, 2017, SIGIR.
[8] Mirella Lapata, et al. Composition in Distributional Models of Semantics, 2010, Cogn. Sci.
[9] Francis Jeffry Pelletier, et al. Did Frege Believe Frege's Principle?, 2001, J. Log. Lang. Inf.
[10] Yoshimasa Tsuruoka, et al. Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures, 2014, EMNLP.
[11] Richard M. Schwartz, et al. Fast and Robust Neural Network Joint Models for Statistical Machine Translation, 2014, ACL.
[12] Kezhi Mao, et al. Topic-Aware Deep Compositional Models for Sentence Classification, 2017, IEEE/ACM Transactions on Audio, Speech, and Language Processing.
[13] Phong Le, et al. Compositional Distributional Semantics with Long Short Term Memory, 2015, *SEMEVAL.
[14] Omer Levy, et al. Neural Word Embedding as Implicit Matrix Factorization, 2014, NIPS.
[15] Naomi Feldman, et al. Weak semantic context helps phonetic learning in a model of infant language acquisition, 2014, ACL.
[16] Quoc V. Le, et al. Distributed Representations of Sentences and Documents, 2014, ICML.
[17] Lukás Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[18] Jonathan Berant, et al. Contextualized Word Representations for Reading Comprehension, 2017, NAACL.
[19] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[20] Ido Dagan, et al. Learning Entailment Rules for Unary Templates, 2008, COLING.
[21] Andrew Y. Ng, et al. Semantic Compositionality through Recursive Matrix-Vector Spaces, 2012, EMNLP.
[22] Ignacio Iacobacci, et al. SensEmbed: Learning Sense Embeddings for Word and Relational Similarity, 2015, ACL.
[23] Peng Wang, et al. Short Text Clustering via Convolutional Neural Networks, 2015, VS@HLT-NAACL.
[24] Ido Dagan, et al. context2vec: Learning Generic Context Embedding with Bidirectional LSTM, 2016, CoNLL.
[25] Kevin Gimpel, et al. Towards Universal Paraphrastic Sentence Embeddings, 2015, ICLR.
[26] Zellig S. Harris. Distributional Structure, 1954.
[27] Jason Weston, et al. A unified architecture for natural language processing: deep neural networks with multitask learning, 2008, ICML.
[28] Wanxiang Che, et al. Learning Sense-specific Word Embeddings By Exploiting Bilingual Resources, 2014, COLING.
[29] Enhong Chen, et al. A Probabilistic Model for Learning Multi-Prototype Word Embeddings, 2014, COLING.
[30] Christof Monz, et al. Learning Topic-Sensitive Word Representations, 2017, ACL.
[31] Jeffrey Pennington, et al. Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions, 2011, EMNLP.
[32] Andrew Y. Ng, et al. Improving Word Representations via Global Context and Multiple Word Prototypes, 2012, ACL.
[33] Ramón Fernández Astudillo, et al. Not All Contexts Are Created Equal: Better Word Representations with Variable Attention, 2015, EMNLP.
[34] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[35] Chris Quirk, et al. Unsupervised Construction of Large Paraphrase Corpora: Exploiting Massively Parallel News Sources, 2004, COLING.
[36] Naoaki Okazaki, et al. Learning Semantically and Additively Compositional Distributional Representations, 2016, ACL.
[37] Mark Steyvers, et al. Finding scientific topics, 2004, Proceedings of the National Academy of Sciences of the United States of America.
[38] Ting Liu, et al. Document Modeling with Gated Recurrent Neural Network for Sentiment Classification, 2015, EMNLP.
[39] Kevin Gimpel, et al. From Paraphrase Database to Compositional Paraphrase Model and Back, 2015, Transactions of the Association for Computational Linguistics.
[40] Kevin Gimpel, et al. Deep Multilingual Correlation for Improved Word Embeddings, 2015, HLT-NAACL.
[41] Christopher D. Manning, et al. Global Belief Recursive Neural Networks, 2014, NIPS.
[42] Zhiyuan Liu, et al. Topical Word Embeddings, 2015, AAAI.
[43] Andrew McCallum, et al. Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space, 2014, EMNLP.
[44] Sanja Fidler, et al. Skip-Thought Vectors, 2015, NIPS.