Learning Numeral Embeddings

Word embeddings are an essential building block of deep learning methods for natural language processing. Although word embeddings have been studied extensively over the years, how to effectively embed numerals, a special subset of words, remains underexplored. Existing word embedding methods do not learn numeral embeddings well because there are infinitely many numerals and each individual numeral occurs only rarely in training corpora. In this paper, we propose two novel numeral embedding methods that handle the out-of-vocabulary (OOV) problem for numerals. We first induce a finite set of prototype numerals using either a self-organizing map or a Gaussian mixture model. We then represent the embedding of a numeral as a weighted average of the prototype numeral embeddings. Numeral embeddings represented in this manner can be plugged into existing word embedding learning approaches such as skip-gram for training. We evaluate our methods on four intrinsic and extrinsic tasks: word similarity, embedding numeracy, numeral prediction, and sequence labeling, and show their effectiveness.
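To make the prototype scheme concrete, below is a minimal sketch of how a numeral's embedding could be assembled as a weighted average of prototype embeddings. The function name `numeral_embedding`, the exponential distance-based weighting, and the `sigma` bandwidth are illustrative assumptions on my part, not the paper's exact formulation; in the described methods the weights would instead come from the self-organizing map or the Gaussian mixture model's posterior probabilities.

```python
import numpy as np

def numeral_embedding(x, prototypes, prototype_vecs, sigma=1.0):
    """Embed a numeral as a weighted average of prototype embeddings.

    Illustrative sketch: `prototypes` are scalar prototype numerals
    (e.g. induced by a self-organizing map or the means of a Gaussian
    mixture), and `prototype_vecs` are their learned embedding vectors.
    The softmax-over-negative-squared-distance weighting and the `sigma`
    bandwidth are assumptions, not the paper's exact weighting scheme.
    """
    # Squared distance between the numeral and each prototype numeral.
    d = (x - prototypes) ** 2
    # Closer prototypes receive exponentially larger weights,
    # then weights are normalized to sum to one.
    w = np.exp(-d / (2 * sigma ** 2))
    w /= w.sum()
    # The numeral's embedding is the weighted average of prototype vectors.
    return w @ prototype_vecs

# Example: 5 prototype numerals with 8-dimensional embeddings.
rng = np.random.default_rng(0)
prototypes = np.array([0.0, 1.0, 10.0, 100.0, 1000.0])
prototype_vecs = rng.normal(size=(5, 8))
vec = numeral_embedding(42.0, prototypes, prototype_vecs, sigma=50.0)
print(vec.shape)  # (8,)
```

During training, a vector built this way would stand in for the numeral's word vector inside a model such as skip-gram, so gradients flow into the small shared set of prototype embeddings rather than into per-numeral vectors; this sharing is what sidesteps the OOV problem for unseen numerals.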
