Graph-to-Tree Learning for Solving Math Word Problems

While recent tree-based neural models have demonstrated promising results in generating solution expressions for math word problems (MWPs), most of these models do not capture the relationships and order information among the quantities well, which leads to poor quantity representations and incorrect solution expressions. In this paper, we propose Graph2Tree, a novel deep learning architecture that combines the merits of a graph-based encoder and a tree-based decoder to generate better solution expressions. Our Graph2Tree framework includes two graphs, the Quantity Cell Graph and the Quantity Comparison Graph, which are designed to address the limitations of existing methods by effectively representing the relationships and order information among the quantities in MWPs. We conduct extensive experiments on two benchmark datasets and show that Graph2Tree significantly outperforms state-of-the-art baselines. We also present case studies and empirically examine Graph2Tree's effectiveness in translating MWP text into solution expressions.
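To make the idea of encoding order information among quantities concrete, the following is a minimal sketch, not the authors' implementation, of how a Quantity Comparison Graph could be built as a directed adjacency matrix: an edge from quantity q_i to q_j is added when q_i is numerically greater than q_j, so a graph encoder can exploit the relative order of the quantities extracted from the problem text. The function name and edge convention are illustrative assumptions.

```python
# Hypothetical sketch: adjacency matrix for a Quantity Comparison Graph.
import numpy as np

def quantity_comparison_graph(quantities):
    """Return a directed adjacency matrix encoding pairwise order:
    adj[i, j] = 1 if quantities[i] > quantities[j] (illustrative convention)."""
    n = len(quantities)
    adj = np.zeros((n, n), dtype=np.float32)
    for i in range(n):
        for j in range(n):
            if i != j and quantities[i] > quantities[j]:
                adj[i, j] = 1.0  # edge from the larger quantity to the smaller one
    return adj

# Example: quantities extracted from an MWP such as
# "Pam has 5 apples and buys 3 more; each costs 2 dollars."
print(quantity_comparison_graph([5.0, 3.0, 2.0]))
```

In the full model, such an adjacency matrix would be consumed by the graph-based encoder (e.g., a graph attention layer) to produce quantity representations that the tree-based decoder then uses when generating the solution expression.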
