Chinese Song Iambics Generation with Neural Attention-Based Model

Learning and generating Chinese poems is a charming yet challenging task. Traditional approaches involve various language modeling and machine translation techniques; however, they do not perform well when generating poems with complex pattern constraints, for example Song iambics, a famous type of poem that involves variable-length sentences and strict rhythmic patterns. This paper applies the attention-based sequence-to-sequence model to generate Chinese Song iambics. Specifically, we encode the cue sentences with a bidirectional Long Short-Term Memory (LSTM) model and then predict the entire iambic from the information provided by the encoder, using an attention-based LSTM decoder that regularizes the generation process according to the fine structure of the input cues. Several techniques are investigated to improve the model, including global context integration, hybrid-style training, and character vector initialization and adaptation. Both automatic and subjective evaluation results show that our model can indeed learn the complex structural and rhythmic patterns of Song iambics, and that the generation is rather successful.
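
For concreteness, the sketch below shows one plausible realization of such an architecture in PyTorch: a bidirectional LSTM encodes the cue sentence, and an LSTM decoder with additive attention over the encoder states predicts the iambic character by character. The class name IambicsSeq2Seq, the layer sizes, and the additive-attention formulation are illustrative assumptions, not the paper's exact implementation.

# A minimal sketch of an attention-based encoder-decoder for iambic generation.
# All names and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IambicsSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM encoder over the cue sentence (character sequence).
        self.encoder = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Unidirectional LSTM decoder that generates the iambic character by character.
        self.decoder = nn.LSTMCell(emb_dim + 2 * hid_dim, hid_dim)
        # Additive attention: scores encoder states against the current decoder state.
        self.attn_w = nn.Linear(2 * hid_dim + hid_dim, hid_dim)
        self.attn_v = nn.Linear(hid_dim, 1, bias=False)
        self.out = nn.Linear(hid_dim, vocab_size)

    def attention(self, dec_h, enc_states):
        # dec_h: (batch, hid); enc_states: (batch, src_len, 2*hid)
        dec_h_exp = dec_h.unsqueeze(1).expand(-1, enc_states.size(1), -1)
        scores = self.attn_v(torch.tanh(self.attn_w(torch.cat([enc_states, dec_h_exp], dim=-1))))
        weights = F.softmax(scores, dim=1)             # attention over source positions
        return (weights * enc_states).sum(dim=1)       # context vector: (batch, 2*hid)

    def forward(self, src, tgt):
        # src: cue-sentence character ids; tgt: gold iambic ids (teacher forcing)
        enc_states, _ = self.encoder(self.embed(src))
        batch = src.size(0)
        h = src.new_zeros(batch, self.decoder.hidden_size, dtype=torch.float)
        c = torch.zeros_like(h)
        logits = []
        for t in range(tgt.size(1)):
            context = self.attention(h, enc_states)
            h, c = self.decoder(torch.cat([self.embed(tgt[:, t]), context], dim=-1), (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)              # (batch, tgt_len, vocab)

At generation time the same decoder would be run autoregressively, feeding back its own predictions instead of the gold characters; the attention weights are what let the fine structure of the cue sentence constrain each generated position.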
