Unsupervised Dependency Parsing: Let’s Use Supervised Parsers

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called ‘iterated reranking’ (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing, which are in turn trained on these trees. Our system achieves accuracy 1.8% higher than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
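The iterated-reranking loop described above can be sketched in a few lines. This is a minimal illustration of the control flow only, assuming abstract stand-ins for the parsers involved; the function and parameter names (`iterated_reranking`, `init_parser`, `train_supervised`, `rerank`) are hypothetical, not the authors' implementation.

```python
def iterated_reranking(sentences, init_parser, train_supervised, rerank, rounds=3):
    """Sketch of the IR loop (names are illustrative placeholders):
    start from trees produced by an unsupervised parser, then repeatedly
    (1) train a richer supervised model on the current trees and
    (2) use it to rerank candidate parses, keeping the new best trees."""
    # Step 0: initial trees from the unsupervised parser.
    trees = [init_parser(s) for s in sentences]
    for _ in range(rounds):
        # Train the richer (supervised-style) model on the current trees.
        model = train_supervised(trees)
        # Replace each tree with the model's reranked best parse.
        trees = [rerank(model, s) for s in sentences]
    return trees


# Toy stand-ins to show the control flow (not a real parser): a "tree" is
# just a list of head indices, with -1 marking the root.
sents = [["the", "dog", "barks"]]
init = lambda s: [i - 1 for i in range(len(s))]      # left-headed chain
train = lambda trees: {"bias": len(trees)}           # dummy "model"
rerank_fn = lambda model, s: [len(s) - 1 if i < len(s) - 1 else -1
                              for i in range(len(s))]
print(iterated_reranking(sents, init, train, rerank_fn))  # → [[2, 2, -1]]
```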

[1] Danqi Chen, et al. A Fast and Accurate Dependency Parser using Neural Networks, 2014, EMNLP.

[2] Yuji Matsumoto, et al. Efficient Stacked Dependency Parsing by Forest Reranking, 2013, Transactions of the Association for Computational Linguistics.

[3] Milan Straka, et al. Stop-probability estimates computed on a large corpus improve Unsupervised Dependency Parsing, 2013, ACL.

[4] Dan Klein, et al. Corpus-Based Induction of Syntactic Structure: Models of Dependency and Constituency, 2004, ACL.

[5] Phong Le, et al. The Inside-Outside Recursive Neural Network model for Dependency Parsing, 2014, EMNLP.

[6] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.

[7] Valentin I. Spitkovsky, et al. Breaking Out of Local Optima with Count Transforms and Model Recombination: A Study in Grammar Induction, 2013, EMNLP.

[8] Kevin Gimpel, et al. Tailoring Continuous Word Representations for Dependency Parsing, 2014, ACL.

[9] Fernando Pereira, et al. Online Learning of Approximate Dependency Parsing Algorithms, 2006, EACL.

[10] Michael Collins, et al. Head-Driven Statistical Models for Natural Language Parsing, 2003, CL.

[11] Valentin I. Spitkovsky, et al. Bootstrapping Dependency Grammar Inducers from Incomplete Sentence Fragments via Austere Models, 2012, ICGI.

[12] Eugene Charniak, et al. Effective Self-Training for Parsing, 2006, NAACL.

[13] Eugene Charniak, et al. Coarse-to-Fine n-Best Parsing and MaxEnt Discriminative Reranking, 2005, ACL.

[14] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.

[15] Rens Bod, et al. A generative re-ranking model for dependency parsing, 2009, IWPT.

[16] Valentin I. Spitkovsky, et al. Viterbi Training Improves Unsupervised Dependency Parsing, 2010, CoNLL.

[17] Kewei Tu, et al. Unambiguity Regularization for Unsupervised Learning of Probabilistic Grammars, 2012, EMNLP.

[18] Noah A. Smith, et al. Turning on the Turbo: Fast Third-Order Non-Projective Turbo Parsers, 2013, ACL.

[19] Yonatan Bisk, et al. Simple Robust Grammar Induction with Combinatory Categorial Grammars, 2012, AAAI.

[20] Tahira Naseem. Linguistically motivated models for lightly-supervised dependency parsing, 2014.

[21] Christopher D. Manning, et al. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks, 2010.

[22] Ben Taskar, et al. Posterior Sparsity in Unsupervised Dependency Parsing, 2011, J. Mach. Learn. Res.

[23] Valentin I. Spitkovsky, et al. From Baby Steps to Leapfrog: How “Less is More” in Unsupervised Dependency Parsing, 2010, NAACL.

[24] Jason Weston, et al. Natural Language Processing (Almost) from Scratch, 2011, J. Mach. Learn. Res.

[25] Avrim Blum, et al. The Bottleneck, 2021, Monopsony Capitalism.

[26] Jason Eisner, et al. Three New Probabilistic Models for Dependency Parsing: An Exploration, 1996, COLING.

[27] George Cybenko, et al. Approximation by superpositions of a sigmoidal function, 1989, Math. Control. Signals Syst.

[28] J. Elman. Learning and development in neural networks: the importance of starting small, 1993, Cognition.

[29] Phil Blunsom, et al. Unsupervised Induction of Tree Substitution Grammars for Dependency Parsing, 2010, EMNLP.

[30] Regina Barzilay, et al. Using Semantic Cues to Learn Syntax, 2011, AAAI.

[31] Michael Collins, et al. Discriminative Reranking for Natural Language Parsing, 2000, CL.