Optimizing visual search with implicit user feedback in interactive video retrieval

This paper describes an approach to optimizing query-by-visual-example results by combining visual features with implicit user feedback in interactive video retrieval. To this end, we propose a framework in which video processing is performed with well-established techniques, while implicit user feedback analysis is realized with a graph-based approach that processes the user actions and navigation patterns during a search session in order to infer semantic relations between video segments. To combine the visual and implicit feedback information, we train a support vector machine classifier with positive and negative examples generated from the graph-structured past user interaction data. The classifier then reranks the visual search results, which were initially ranked by visual features alone. This framework is embedded in an interactive video search engine and evaluated in a two-phase user experiment: first, we record user actions during typical retrieval sessions; then, we evaluate the reranking of the visual query-by-example results. The evaluation demonstrates that the proposed approach improves the ranking in most of the evaluated queries.
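The reranking step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data is synthetic, and it assumes scikit-learn's `SVC` as the support vector machine. In the paper, the positive and negative training examples come from the graph of past user actions; here they are simply simulated.

```python
import numpy as np
from sklearn.svm import SVC

# Toy visual feature vectors for 20 candidate video segments (hypothetical data).
rng = np.random.default_rng(0)
segments = rng.normal(size=(20, 8))

# Positive/negative training examples. In the paper these are derived from the
# graph-structured past user interaction data; here we simulate them by
# shifting a few segments to make the two classes separable.
pos = segments[:5] + 1.0   # segments the interaction graph would mark relevant
neg = segments[5:10] - 1.0
X = np.vstack([pos, neg])
y = np.array([1] * 5 + [0] * 5)

# Train the SVM classifier on the implicit-feedback-derived examples.
clf = SVC(kernel="linear", probability=True).fit(X, y)

# Initial visual ranking: similarity of each segment to the query example
# (negative Euclidean distance, so higher is better).
query = segments[0]
visual_scores = -np.linalg.norm(segments - query, axis=1)

# Rerank: order the candidates by the classifier's relevance probability
# instead of the purely visual score.
relevance = clf.predict_proba(segments)[:, 1]
reranked = np.argsort(-relevance)
print(reranked[:5])
```

The key design point mirrored here is that the SVM never sees explicit relevance judgments: its training labels are manufactured from implicit interaction evidence, and its scores then override the purely visual ordering.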
