Measuring improvement in user search performance resulting from optimal search tips

Web search performance can be improved either by improving the search engine itself or by educating users to search more efficiently. A large body of literature describes techniques for measuring the former, whereas improvements resulting from the latter are more difficult to quantify. In this paper we present an experimental methodology that successfully quantifies improvements from user education. The user education in our study takes the form of tactical search feature tips that expand user awareness of task-relevant tools and features of the search application. Initially, these tips are presented in an idealized situation: each tip is shown at the same time as study participants are given a task constructed to benefit from that specific tip. However, we also present a follow-up study, conducted roughly one week later, in which the search tips are no longer shown but the participants who previously received them still demonstrate improved search efficiency compared to the control group. This research has implications for search user interface designers and for the study of information retrieval systems.