RU-EVAL-2012: Evaluating Dependency Parsers for Russian

The paper reports on the recent forum RU-EVAL, a new initiative for the evaluation of Russian NLP resources, methods, and toolkits. It started in 2010 with the evaluation of morphological parsers, and the second event, RU-EVAL 2012 (2011-2012), focused on syntactic parsing. Eight participating IT companies and academic institutions submitted their results for corpus parsing. We discuss the results of this evaluation and describe the so-called "soft" evaluation principles that allowed us to compare output dependency trees, which varied greatly depending on the theoretical approaches, parsing methods, tag sets, and dependency orientation principles adopted by the participants.