The CLEF 2001 Interactive Track

The problem of finding documents written in a language that the searcher cannot read is perhaps the most challenging application of cross-language information retrieval technology. In interactive applications, that task involves at least two steps: (1) the machine locates promising documents in a collection that is larger than the searcher could scan, and (2) the searcher recognizes documents relevant to their intended use from among those nominated by the machine. The goal of the 2001 Cross-Language Evaluation Forum's experimental interactive track was to explore the ability of present technology to support interactive relevance assessment. This paper describes the shared experiment design used at all three participating sites, summarizes preliminary results from the evaluation, and concludes with observations on lessons learned that can inform the design of subsequent evaluation campaigns.