Evaluating interactive bibliographic information retrieval systems: A user‐centered approach

As an integral part of their research activities, scholars frequently interact with bibliographic information retrieval systems to acquire scholarly information. With the dramatic growth of the scientific literature, there is a pressing need for effective and efficient bibliographic information retrieval systems that support more granular and complex bibliographic information needs. Investigating how users interact with these systems is important for understanding the systems' features and usability issues, which in turn contributes to better system design and implementation. In this paper, we use a user-centered approach to evaluate three interactive bibliographic information retrieval systems proposed in our previous studies: a form-based, a natural language-based, and a visual graph-based system. The study recruited 20 participants, who evaluated the three systems in terms of success rate, search time, query size, usefulness, ease of use, ease of learning, and satisfaction. Results showed that the form-based system required the least time to formulate simple bibliographic queries and led to the fewest participant errors; the natural language-based system required the least time to formulate high-complexity bibliographic queries and was rated the most useful, easiest to use, and easiest to learn; and the visual graph-based system's strength lay in its support for complex queries, as it received higher ratings for usefulness and satisfaction from the high-complexity task group.
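The performance measures named above (success rate, search time, and query size) are commonly computed as simple aggregates over per-task session records. As a minimal sketch, assuming hypothetical session logs with illustrative field names (`completed`, `search_time_s`, `query_terms` are not taken from the paper's actual instrument), the aggregation could look like:

```python
from statistics import mean

# Hypothetical per-task session records for one participant group.
# Field names are illustrative assumptions, not the study's actual schema.
sessions = [
    {"completed": True,  "search_time_s": 42.0,  "query_terms": 3},
    {"completed": True,  "search_time_s": 65.5,  "query_terms": 5},
    {"completed": False, "search_time_s": 120.0, "query_terms": 7},
]

def success_rate(records):
    """Fraction of tasks completed successfully."""
    return sum(r["completed"] for r in records) / len(records)

def mean_search_time(records):
    """Average task duration in seconds, counting failed attempts too."""
    return mean(r["search_time_s"] for r in records)

def mean_query_size(records):
    """Average number of terms per submitted query."""
    return mean(r["query_terms"] for r in records)

print(success_rate(sessions))      # 2 of 3 tasks completed
print(mean_search_time(sessions))
print(mean_query_size(sessions))
```

Subjective measures (usefulness, ease of use, ease of learning, satisfaction) would instead come from post-task questionnaire ratings and be averaged per system in the same way.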
