ZPR2: Joint Zero Pronoun Recovery and Resolution using Multi-Task Learning and BERT

Zero pronoun recovery and resolution aim to restore a dropped pronoun and to identify its anaphoric mentions, respectively. Whereas previous work treats the two tasks separately, we propose to exploit their interaction by solving them jointly. For zero pronoun resolution, we adopt a more realistic setting in which no parse trees, or only automatic ones, are available, while most previous work assumes gold trees. Experiments on two benchmarks show that joint modeling significantly outperforms our baseline, which already surpasses the previous state of the art.
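The joint setup described above can be sketched as a shared encoder feeding two task-specific heads whose losses are summed. The following is a minimal NumPy sketch, not the paper's implementation: the encoder states stand in for BERT outputs, and the head shapes, the bilinear pointer for resolution, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy shared encoder output: one vector per token (stand-in for BERT states).
seq_len, hidden = 6, 8
H = rng.normal(size=(seq_len, hidden))

# Recovery head: classify the gap position over a small pronoun vocabulary.
n_pronouns = 5
W_rec = rng.normal(size=(hidden, n_pronouns))
rec_logits = H @ W_rec                      # (seq_len, n_pronouns)

# Resolution head: score each token as the antecedent of the zero pronoun
# at position `gap` (a simple bilinear pointer; an assumption of this sketch).
gap = 2
W_res = rng.normal(size=(hidden, hidden))
res_scores = H @ (W_res @ H[gap])           # (seq_len,)

# Multi-task objective: sum the two cross-entropies so gradients from both
# tasks flow into the shared encoder parameters.
rec_gold, res_gold = 1, 4                   # toy gold labels
loss_rec = -np.log(softmax(rec_logits[gap])[rec_gold])
loss_res = -np.log(softmax(res_scores)[res_gold])
joint_loss = loss_rec + loss_res
```

In a real system both heads would sit on a fine-tuned BERT encoder and the two losses could be weighted; the point here is only that the two tasks share one representation and one backward pass.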
