Misspelling Oblivious Word Embeddings
Bora Edizel | Aleksandra Piktus | Piotr Bojanowski | Rui Ferreira | Edouard Grave | Fabrizio Silvestri
[1] Vladimir I. Levenshtein, et al. Binary codes capable of correcting deletions, insertions, and reversals, 1965.
[2] Geoffrey E. Hinton, et al. Learning distributed representations of concepts, 1989.
[3] T. Landauer, et al. Indexing by Latent Semantic Analysis, 1990.
[4] Ellen M. Voorhees, et al. The TREC-8 Question Answering Track Report, 1999, TREC.
[5] Yoshua Bengio, et al. A Neural Probabilistic Language Model, 2003, J. Mach. Learn. Res.
[6] Ehud Rivlin, et al. Placing search in context: the concept revisited, 2002, TOIS.
[7] Eric Brill, et al. Spelling Correction as an Iterative Process that Exploits the Collective Knowledge of Web Users, 2004, EMNLP.
[8] Jason Weston, et al. A unified architecture for natural language processing: deep neural networks with multitask learning, 2008, ICML '08.
[9] C. Spearman. The proof and measurement of association between two things, 2015, International Journal of Epidemiology.
[10] Jeffrey Dean, et al. Efficient Estimation of Word Representations in Vector Space, 2013, ICLR.
[11] Christopher D. Manning, et al. Better Word Representations with Recursive Neural Networks for Morphology, 2013, CoNLL.
[12] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[13] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[14] Omer Levy, et al. Neural Word Embedding as Implicit Matrix Factorization, 2014, NIPS.
[15] Alexander M. Rush, et al. Character-Aware Neural Language Models, 2015, AAAI.
[16] Eduard H. Hovy, et al. End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF, 2016, ACL.
[17] Jacob Eisenstein, et al. Mimicking Word Embeddings using Subword RNNs, 2017, EMNLP.
[18] Tomas Mikolov, et al. Enriching Word Vectors with Subword Information, 2016, TACL.
[19] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[20] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.