Russian Natural Language Generation: Creation of a Language Modelling Dataset and Evaluation with Modern Neural Architectures

Generating coherent, grammatically correct, and meaningful text is very challenging; however, it is crucial to many modern NLP systems. So far, research has mostly focused on the English language; for other languages, both standardized datasets and experiments with state-of-the-art models are rare. In this work, we i) provide a novel reference dataset for Russian language modeling, and ii) experiment with popular modern methods for text generation, namely variational autoencoders and generative adversarial networks, which we train on the new dataset. We evaluate the generated text using metrics such as perplexity, grammatical correctness, and lexical diversity.
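Perplexity, one of the evaluation metrics mentioned above, is the exponential of the average per-token negative log-likelihood a language model assigns to a held-out text. The following is a minimal sketch of that computation; the function name and the toy uniform-model example are illustrative, not taken from the paper.

```python
import math

def perplexity(token_log_probs):
    """Perplexity of a sequence from per-token natural-log probabilities.

    PPL = exp( -(1/N) * sum_i log p(w_i | context) )
    Lower is better: the model is less "surprised" by the text.
    """
    n = len(token_log_probs)
    avg_nll = -sum(token_log_probs) / n
    return math.exp(avg_nll)

# Sanity check: a uniform model over a 10-word vocabulary assigns
# log(1/10) to every token, so its perplexity is exactly 10.
uniform = [math.log(1 / 10)] * 5
print(perplexity(uniform))  # ≈ 10.0
```

Because the metric only needs per-token log-probabilities, the same function applies regardless of whether those probabilities come from an RNN, a VAE decoder, or a GAN-trained generator.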
