User evaluation of ontology as query construction tool

This study examines the use of an ontology as a search tool. Sixteen subjects created queries with the Concept-based Information Retrieval Interface (CIRI) and with a regular baseline IR interface. The simulated work task method was used to make the search situations realistic. The subjects' search experiences, queries and search results were examined. The numbers of search concepts and search keys in the queries, as well as their overlap, were investigated, and the effectiveness of the CIRI and baseline queries was compared. An Ontology Index (OI) was calculated for each search task, and the correlation between the OI and the overlap of search concepts and keys in the queries was analysed. The number of search keys and concepts was higher in CIRI queries than in baseline interface queries, and the overlap of search keys was also higher among CIRI users than among baseline users; both findings are attributable to CIRI's query expansion feature. There was no clear correlation between the OI and the overlap of search concepts and keys. The search results were evaluated with generalised precision and recall and with relevance scores based on individual relevance assessments. The baseline interface queries performed better in all comparisons, but the difference was statistically significant only for the relevance scores based on individual relevance assessments.
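
For illustration, the sketch below shows how two of the measures named in the abstract might be computed: overlap between sets of search keys and generalised precision/recall over graded relevance assessments (in the spirit of Kekäläinen and Järvelin). The abstract does not give the study's exact definitions, so the Jaccard overlap, the function names and the sample data here are assumptions, not the authors' formulas; the Ontology Index is omitted because its definition is not stated.

```python
# Hypothetical sketch of the evaluation measures mentioned in the abstract.
# The overlap measure (Jaccard) and the gP/gR definitions are assumptions.

def key_overlap(keys_a: set, keys_b: set) -> float:
    """Jaccard overlap of two sets of search keys (assumed overlap measure)."""
    if not keys_a and not keys_b:
        return 0.0
    return len(keys_a & keys_b) / len(keys_a | keys_b)


def generalised_precision(retrieved: list, scores: dict) -> float:
    """Sum of graded relevance scores of retrieved documents divided by the
    number of retrieved documents (generalised precision, gP)."""
    if not retrieved:
        return 0.0
    return sum(scores.get(d, 0.0) for d in retrieved) / len(retrieved)


def generalised_recall(retrieved: list, scores: dict) -> float:
    """Sum of graded relevance scores of retrieved documents divided by the
    total relevance mass in the recall base (generalised recall, gR)."""
    total = sum(scores.values())
    if total == 0.0:
        return 0.0
    return sum(scores.get(d, 0.0) for d in retrieved) / total


if __name__ == "__main__":
    # Toy graded assessments and one query's result list (illustrative only).
    scores = {"d1": 1.0, "d2": 0.5, "d3": 0.0, "d4": 1.0}
    run = ["d1", "d3", "d4"]
    print(key_overlap({"ontology", "search"}, {"ontology", "query"}))
    print(generalised_precision(run, scores), generalised_recall(run, scores))
```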
