Evaluating Heterogeneous Information Access (Position Paper)

Information access is becoming increasingly heterogeneous. We need a better understanding of the more complex user behaviour that arises in this context so that we can properly evaluate search systems dealing with heterogeneous information. In this paper, we review the main challenges associated with evaluating search in this context and propose some avenues for incorporating user aspects into evaluation measures.
