FutureView: Enhancing Exploratory Image Search

Search algorithms in image retrieval tend to return increasingly similar images in response to queries that the user must formulate explicitly. Such systems implicitly limit the user's exploration of the image space and thus remove the potential for serendipity. In response, recent years have seen growing interest in content-based image retrieval systems that let the user explore the image space without typing specific search queries. However, most of this research focuses on designing new algorithms and techniques, while little work has addressed interfaces that let the user actively direct their image search. We present FutureView, an interactive interface that can easily be combined with most existing exploratory image search engines. The interface gives the user a preview of possible future search iterations. A task-based user study demonstrates that our interface enhances exploratory image search by providing access to more images without increasing the time required to find a specific image.
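The abstract does not describe the underlying retrieval algorithm, but the core idea of previewing "possible future search iterations" can be illustrated with a small sketch. The Python code below is a minimal, hypothetical example, assuming a toy centroid-based relevance-feedback engine; all names (SearchEngine, simulate_selection, future_view) are illustrative and are not taken from the paper.

# Hypothetical sketch (not the paper's implementation): a lookahead "future view"
# layered on top of a generic relevance-feedback image search engine.

from dataclasses import dataclass, field
import random

@dataclass
class SearchEngine:
    """Toy query-less engine: images are points in a 1-D feature space and the
    engine ranks them by proximity to the centroid of images selected so far."""
    features: dict                      # image id -> feature value
    selected: list = field(default_factory=list)

    def current_results(self, k=5):
        if not self.selected:           # cold start: arbitrary sample
            return sorted(self.features)[:k]
        centroid = sum(self.features[i] for i in self.selected) / len(self.selected)
        ranked = sorted(self.features, key=lambda i: abs(self.features[i] - centroid))
        return [i for i in ranked if i not in self.selected][:k]

    def simulate_selection(self, image_id, k=5):
        """Return the results the engine *would* show if the user picked image_id,
        without committing the selection (this is the 'future view')."""
        preview = SearchEngine(self.features, self.selected + [image_id])
        return preview.current_results(k)

def future_view(engine, k=5):
    """For each image in the current result set, precompute the next iteration."""
    return {img: engine.simulate_selection(img, k) for img in engine.current_results(k)}

if __name__ == "__main__":
    random.seed(0)
    imgs = {f"img{i}": random.random() for i in range(50)}
    engine = SearchEngine(imgs)
    for img, next_results in future_view(engine).items():
        print(f"selecting {img} would lead to: {next_results}")

Because the preview only calls the engine's ordinary ranking step on a hypothetical selection, a wrapper of this kind can in principle sit on top of any relevance-feedback retrieval engine, which is consistent with the abstract's claim that the interface can be combined with most existing exploratory image search systems.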
