Learning Diachronic Analogies to Analyze Concept Change

We propose to study the evolution of concepts by learning to complete diachronic analogies between lists of terms which relate to the same concept at different points in time. We present a number of models based on operations on word embeddings that correspond to different assumptions about the characteristics of diachronic analogies and change in concept vocabularies. These are tested in a quantitative evaluation for nine different concepts on a corpus of Dutch newspapers from the 1950s and 1980s. We show that a model which treats the concept terms as analogous and learns weights to compensate for diachronic changes (weighted linear combination) predicts the missing term more accurately than a learned transformation and two baselines for most of the evaluated concepts. We also find that all models tend to be coherent in relation to the represented concept, but less discriminative with regard to other concepts. Additionally, we evaluate the effect of aligning the time-specific embedding spaces using orthogonal Procrustes, finding varying effects on performance depending on the model, concept, and evaluation metric. For the weighted linear combination, however, results improve with alignment in a majority of cases. All related code is released publicly.
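The orthogonal Procrustes alignment mentioned above has a standard closed-form solution (Schönemann, 1966): given two embedding matrices X and Y whose rows correspond to the same vocabulary, the orthogonal matrix W minimizing ||XW − Y||_F is obtained from the SVD of XᵀY. The sketch below is a minimal illustration of that general technique, not the paper's released code; the function name and the use of NumPy are my own assumptions.

```python
import numpy as np

def align_orthogonal_procrustes(X, Y):
    """Align embedding space X to Y with an orthogonal map.

    X, Y: (n_words, dim) matrices whose rows are embeddings of the
    same anchor vocabulary in two time-specific spaces.
    Returns X mapped into Y's space, i.e. X @ W with W orthogonal.
    """
    # Closed-form solution: W = U V^T where U S V^T = SVD(X^T Y).
    U, _, Vt = np.linalg.svd(X.T @ Y)
    W = U @ Vt
    return X @ W
```

Because W is constrained to be orthogonal, the alignment preserves all distances and angles within the source space; it only rotates (and possibly reflects) the space so that shared anchor words line up across time periods.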
