i-Parser: Interactive Parser Development Kit for Natural Language Processing

This demonstration paper presents i-Parser, a novel development kit for building high-performance semantic parsers. i-Parser converts training graphs into sequences of a context-free language, and our proposed model then learns to generate these sequences. With interactive configuration and visualization, users can easily build their own parsers. Benchmark results show that i-Parser achieves high performance on a variety of parsing tasks in natural language processing.
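The graph-to-sequence step mentioned above can be pictured with a small sketch. The following Python snippet is not the i-Parser implementation; it only illustrates, under assumed conventions, one common way to linearize a rooted, labeled semantic graph into a bracketed token sequence that a context-free grammar can describe and that a sequence generator can learn to produce. The function name, graph encoding, and toy example are all hypothetical.

```python
# Minimal sketch (assumptions, not the authors' method): depth-first
# linearization of a labeled graph into a parenthesized token sequence.

def linearize(graph, node, visited=None):
    """Return a bracketed token list for the subgraph rooted at `node`."""
    if visited is None:
        visited = set()
    if node in visited:
        # Re-entrant node: emit a reference token instead of re-expanding it.
        return [f"*{node}"]
    visited.add(node)
    tokens = ["(", graph["labels"][node]]
    for edge_label, child in graph["edges"].get(node, []):
        tokens.append(f":{edge_label}")
        tokens.extend(linearize(graph, child, visited))
    tokens.append(")")
    return tokens

# Toy AMR-style graph for "the boy wants to go" (illustrative only).
graph = {
    "labels": {"w": "want-01", "b": "boy", "g": "go-01"},
    "edges": {"w": [("ARG0", "b"), ("ARG1", "g")], "g": [("ARG0", "b")]},
}
print(" ".join(linearize(graph, "w")))
# ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 *b ) )
```

A model trained on such sequences can then be paired with the inverse mapping to reconstruct graphs from generated output.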
