Use cases as a component of information access evaluation

Information access research and development, and information retrieval in particular, is grounded in quantitative and systematic benchmarking. Benchmarking a computational mechanism always rests on a set of assumptions about how a system incorporating that mechanism will provide value for its users in concrete situations, and those assumptions need to be validated somehow. The considerable effort put into such validation studies is seldom reusable for other research or system development projects. This paper argues that use cases for information access can be written to give explicit pointers towards benchmarking mechanisms, and that if use cases and hypotheses about user preferences, goals, expectations and satisfaction are made explicit in the design of research systems, they can more conveniently be validated or disproven; this in turn makes the results emanating from research efforts more relevant for industrial partners, more sustainable for future research, and more portable across projects and studies.
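
As a rough illustration of what such an explicit pointer towards a benchmarking mechanism might look like, the sketch below encodes a use case together with the hypotheses it commits a system design to. This is a minimal, hypothetical Python encoding: the field names, the example measures, and the patent-search scenario are invented for illustration and are not prescribed by the paper.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationHypothesis:
    """A testable assumption about user preference, goal, expectation, or satisfaction."""
    statement: str          # e.g. "analysts in this role value exhaustive recall over ranking quality"
    measure: str            # e.g. "recall@1000", "nDCG@10", "task completion time"
    validation_method: str  # e.g. "batch benchmark", "user study", "log analysis"

@dataclass
class UseCase:
    """A use case annotated with the hypotheses it commits the system design to."""
    actor: str
    goal: str
    scenario: str
    hypotheses: List[EvaluationHypothesis] = field(default_factory=list)

# Illustrative instance (invented example, not from the paper):
patent_search = UseCase(
    actor="patent analyst",
    goal="find all prior art relevant to a claim",
    scenario="iterative querying over a patent collection",
    hypotheses=[
        EvaluationHypothesis(
            statement="users in this role value exhaustive recall over early precision",
            measure="recall@1000",
            validation_method="batch benchmark complemented by analyst interviews",
        )
    ],
)

Recording hypotheses in this way keeps the assumptions behind a benchmark visible, so that a validation study carried out in one project can be checked, reused, or challenged in another.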
