[1] Roberto Navigli, et al. NASARI: Integrating explicit knowledge and corpus statistics for a multilingual representation of concepts and entities, 2016, Artif. Intell.
[2] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[3] Christopher D. Manning, et al. Better Word Representations with Recursive Neural Networks for Morphology, 2013, CoNLL.
[4] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[5] Zhaohui Wu, et al. Sense-Aware Semantic Analysis: A Multi-Prototype Word Representation Model Using Wikipedia, 2015, AAAI.
[6] Timothy Baldwin, et al. unimelb: Topic Modelling-based Word Sense Induction for Web Snippet Clustering, 2013, SemEval@NAACL-HLT.
[7] Stuart Geman, et al. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images, 1988.
[8] Andrew Y. Ng, et al. Improving Word Representations via Global Context and Multiple Word Prototypes, 2012, ACL.
[9] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[10] Heng Zhang, et al. Improving short text classification by learning vector representations of both words and hidden topics, 2016, Knowl. Based Syst.
[11] Zhiyuan Liu, et al. A Unified Model for Word Sense Representation and Disambiguation, 2014, EMNLP.
[12] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[13] Ignacio Iacobacci, et al. Embedding Words and Senses Together via Joint Knowledge-Enhanced Training, 2016, CoNLL.
[14] Ignacio Iacobacci, et al. SensEmbed: Learning Sense Embeddings for Word and Relational Similarity, 2015, ACL.
[15] Xuanjing Huang, et al. Learning Context-Sensitive Word Embeddings with Neural Tensor Skip-Gram Model, 2015, IJCAI.
[16] Daniel Jurafsky, et al. Do Multi-Sense Embeddings Improve Natural Language Understanding?, 2015, EMNLP.
[17] Ehud Rivlin, et al. Placing search in context: the concept revisited, 2002, TOIS.
[18] Chris Dyer, et al. Ontologically Grounded Multi-sense Representation Learning for Semantic Vector Space Models, 2015, NAACL.
[19] Michael I. Jordan, et al. Latent Dirichlet Allocation, 2003, J. Mach. Learn. Res.
[20] Hsin-Hsi Chen, et al. GenSense: A Generalized Sense Retrofitting Model, 2018, COLING.
[21] Christian Biemann, et al. Making Sense of Word Embeddings, 2016, Rep4NLP@ACL.
[22] Zhiyuan Liu, et al. Topical Word Embeddings, 2015, AAAI.
[23] Stefan Thater, et al. A Mixture Model for Learning Multi-Sense Word Embeddings, 2017, *SEM.
[24] Ji-Rong Wen, et al. Contextual Text Understanding in Distributional Semantic Space, 2015, CIKM.
[25] Andrew McCallum, et al. Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space, 2014, EMNLP.
[26] Jeffrey Dean, et al. Efficient Estimation of Word Representations in Vector Space, 2013, ICLR.
[27] Donald Geman, et al. Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images, 1984, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[28] Enhong Chen, et al. A Probabilistic Model for Learning Multi-Prototype Word Embeddings, 2014, COLING.
[29] Hinrich Schütze, et al. AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes, 2015, ACL.
[30] Grahame B. Smith. Stuart Geman and Donald Geman, “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images”, 1987.
[31] John B. Goodenough, et al. Contextual correlates of synonymy, 1965, CACM.
[32] Raymond J. Mooney, et al. Multi-Prototype Vector-Space Models of Word Meaning, 2010, NAACL.
[33] Nigel Collier, et al. De-Conflated Semantic Representations, 2016, EMNLP.