Learning class-specific word embeddings
[1] Andrew Y. Ng et al. Improving Word Representations via Global Context and Multiple Word Prototypes, 2012, ACL.
[2] Shan Wu et al. A neural generative autoencoder for bilingual word embeddings, 2018, Inf. Sci.
[3] Fernando Diaz et al. CrisisLex: A Lexicon for Collecting and Filtering Microblogged Communications in Crises, 2014, ICWSM.
[4] Andrew Gordon Wilson et al. Probabilistic FastText for Multi-Sense Word Embeddings, 2018, ACL.
[5] Mirella Lapata et al. Vector-based Models of Semantic Composition, 2008, ACL.
[6] Omer Levy et al. Dependency-Based Word Embeddings, 2014, ACL.
[7] Yoon Kim et al. Convolutional Neural Networks for Sentence Classification, 2014, EMNLP.
[8] Ming Zhou et al. Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification, 2014, ACL.
[9] Patrick F. Reidy. An Introduction to Latent Semantic Analysis, 2009.
[10] Michael I. Jordan et al. Latent Dirichlet Allocation, 2003, J. Mach. Learn. Res.
[11] J. A. Hartigan et al. A k-means clustering algorithm, 1979.
[12] Andrew McCallum et al. Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space, 2014, EMNLP.
[13] Jason Weston et al. Natural Language Processing (Almost) from Scratch, 2011, J. Mach. Learn. Res.
[14] Steven Skiena et al. The Expressive Power of Word Embeddings, 2013, ArXiv.
[15] Xiaomo Liu et al. Data Sets: Word Embeddings Learned from Tweets and General Data, 2017, ICWSM.
[16] Wanxiang Che et al. Learning Sense-specific Word Embeddings By Exploiting Bilingual Resources, 2014, COLING.
[17] Jeffrey Dean et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[18] Mo Yu. Factor-based Compositional Embedding Models, 2014.
[19] Mark Dredze et al. Improving Lexical Embeddings with Semantic Knowledge, 2014, ACL.
[20] Giuseppe Attardi. DeepNL: a Deep Learning NLP pipeline, 2015, VS@HLT-NAACL.
[21] Douglas A. Reynolds et al. Gaussian Mixture Models, 2018, Encyclopedia of Biometrics.
[22] Scharolta Katharina Siencnik. Adapting word2vec to Named Entity Recognition, 2015, NODALIDA.
[23] Hui Chen et al. Bilinear joint learning of word and entity embeddings for Entity Linking, 2018, Neurocomputing.
[24] Yi Chen et al. Learning Context-Specific Word/Character Embeddings, 2017, AAAI.
[25] Enhong Chen et al. A Probabilistic Model for Learning Multi-Prototype Word Embeddings, 2014, COLING.
[26] Thomas L. Griffiths et al. Evaluating Vector-Space Models of Word Representation, or, The Unreasonable Effectiveness of Counting Words Near Other Words, 2017, CogSci.
[27] Christian Biemann et al. Making Sense of Word Embeddings, 2016, Rep4NLP@ACL.
[28] Wang Ling et al. Two/Too Simple Adaptations of Word2Vec for Syntax Problems, 2015, NAACL.
[29] Evangelos Kanoulas et al. Improving Word Embedding Compositionality using Lexicographic Definitions, 2018, WWW.
[30] Ken-ichi Kawarabayashi et al. Using k-way Co-occurrences for Learning Word Embeddings, 2018, AAAI.
[31] Tomas Mikolov et al. Bag of Tricks for Efficient Text Classification, 2016, EACL.
[32] Jeffrey Pennington et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[33] Yoshua Bengio et al. A Neural Probabilistic Language Model, 2003, J. Mach. Learn. Res.
[34] Luke S. Zettlemoyer et al. Deep Contextualized Word Representations, 2018, NAACL.
[35] Tomas Mikolov et al. Enriching Word Vectors with Subword Information, 2016, TACL.
[36] Zellig S. Harris et al. Distributional Structure, 1954.
[37] Brian D. Davison et al. Class-Specific Word Embedding through Linear Compositionality, 2018, IEEE International Conference on Big Data and Smart Computing (BigComp).
[38] Yu Hu et al. Part-of-Speech Relevance Weights for Learning Word Embeddings, 2016, ArXiv.
[39] John Liu et al. sense2vec - A Fast and Accurate Method for Word Sense Disambiguation In Neural Word Embeddings, 2015, ArXiv.
[40] Zhiyuan Liu et al. A Unified Model for Word Sense Representation and Disambiguation, 2014, EMNLP.