Multi-Domain Dialogue State Tracking - A Purely Transformer-Based Generative Approach

We investigate the problem of multi-domain Dialogue State Tracking (DST) with an open vocabulary. Existing approaches combine a BERT encoder with a copy-based RNN decoder, where the encoder first predicts the state operation and the decoder then generates new slot values. However, in this stacked encoder-decoder structure, the operation prediction objective only affects the BERT encoder, while the value generation objective mainly affects the RNN decoder. In this paper, we propose a purely Transformer-based framework that uses BERT as both encoder and decoder, so that the operation prediction and value generation objectives jointly optimize the model for DST. At each decoding step, we re-use the hidden states of the encoder in the self-attention mechanism of the corresponding decoder layer, yielding a flat model structure that supports effective parameter updating. Experimental results show that our approach substantially outperforms the existing state-of-the-art open-vocabulary framework and achieves performance very competitive with the best ontology-based approaches.
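The core architectural idea in the abstract, letting each decoder layer attend over the hidden states of its corresponding encoder layer inside a single self-attention block rather than through a separate cross-attention stack, can be sketched as follows. This is a minimal illustration in PyTorch, not the paper's actual implementation: the class name `FlatDecoderLayer`, the layer sizes, and the concatenate-then-attend formulation are all assumptions made for the example.

```python
import torch
import torch.nn as nn


class FlatDecoderLayer(nn.Module):
    """Illustrative decoder layer for a "flat" Transformer structure.

    Instead of separate self-attention and cross-attention sub-layers,
    the decoder states attend over the concatenation of the matching
    encoder layer's hidden states and the decoder states themselves,
    so gradients from value generation flow directly into the shared
    (BERT-style) encoder parameters. Hypothetical sketch, not the
    paper's exact architecture.
    """

    def __init__(self, hidden_size: int = 64, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads,
                                          batch_first=True)
        self.norm1 = nn.LayerNorm(hidden_size)
        self.ffn = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.GELU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )
        self.norm2 = nn.LayerNorm(hidden_size)

    def forward(self, dec_states: torch.Tensor,
                enc_states: torch.Tensor) -> torch.Tensor:
        # Keys/values combine the re-used encoder hidden states with
        # the decoder states: one flat attention over both.
        kv = torch.cat([enc_states, dec_states], dim=1)
        attn_out, _ = self.attn(dec_states, kv, kv)
        x = self.norm1(dec_states + attn_out)
        return self.norm2(x + self.ffn(x))


if __name__ == "__main__":
    layer = FlatDecoderLayer()
    enc = torch.randn(2, 10, 64)  # encoder hidden states (same layer)
    dec = torch.randn(2, 3, 64)   # partially generated slot-value tokens
    out = layer(dec, enc)
    print(out.shape)  # output keeps the decoder sequence length
```

In this sketch the output has the same shape as the decoder input, while the attention context has been widened to include the encoder layer's states, which is what gives the value-generation loss a direct path to the encoder parameters.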
