Topic-Aware Pointer-Generator Networks for Summarizing Spoken Conversations

Due to the lack of publicly available resources, conversation summarization has received far less attention than text summarization. Because the purpose of a conversation is to exchange information between at least two interlocutors, key information about a given topic is often scattered across multiple utterances and turns from different speakers. This phenomenon is even more pronounced in spoken conversations, where speech characteristics such as backchanneling and false starts can interrupt the topical flow. Moreover, topic diffusion and (intra-utterance) topic drift are also more common in human-to-human conversations. These linguistic characteristics of dialogue topics make the sentence-level extractive summarization approaches used for spoken documents ill-suited to summarizing conversations. Pointer-generator networks have demonstrated their strength at integrating extractive and abstractive capabilities through neural modeling in text summarization, but to the best of our knowledge they have not yet been adopted for summarizing conversations. In this work, we propose a topic-aware architecture that exploits the inherent hierarchical structure of conversations to further adapt the pointer-generator model. Our approach significantly outperforms competitive baselines, achieves more efficient learning, and attains more robust performance.
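The pointer-generator model referenced above blends generation from a fixed vocabulary with copying from the source via attention. A minimal sketch of that final-distribution computation is shown below; the function name and toy values are illustrative, not from the paper.

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, p_gen):
    """Blend the decoder's vocabulary distribution with a copy distribution
    derived from attention over the source tokens:
        P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum_{i : src_i = w} a_i
    p_vocab   : distribution over the vocabulary at this decoding step
    attention : attention weights over source positions (sums to 1)
    src_ids   : source tokens as vocabulary indices
    p_gen     : scalar gate in [0, 1] favoring generation over copying
    """
    final = p_gen * np.asarray(p_vocab, dtype=float)
    for weight, tok in zip(attention, src_ids):
        # Route the copy probability mass to each source token's vocab slot.
        final[tok] += (1.0 - p_gen) * weight
    return final

# Toy example: 5-word vocabulary, 3 source tokens.
p_vocab = np.array([0.1, 0.4, 0.2, 0.2, 0.1])
attention = np.array([0.6, 0.3, 0.1])  # attention over source positions
src_ids = [2, 2, 4]                    # source tokens as vocab indices
dist = pointer_generator_dist(p_vocab, attention, p_gen=0.7, src_ids=src_ids)
```

Because both input distributions sum to one, the blended output remains a valid probability distribution; a high attention mass on an input token can make it the most likely output even when the vocabulary distribution prefers another word.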
