Evaluating DUC 2005 using Basic Elements

In this paper we introduce Basic Elements, a new method for automating the evaluation of text summaries. We show that this method correlates more closely with human judgments than any other automated procedure to date, and that it overcomes the subjectivity and variability problems of manual methods, which require humans to preprocess the summaries being evaluated. We demonstrate this on the DUC 2005 peer systems and the summaries they produced.