Inferring search intents from remote control movement patterns: a new content search method for smart TV

The smart TV has become an increasingly important multimedia device in recent years thanks to powerful functionality that makes it easy to consume multimedia content. In the smart TV environment, where the interface for controlling content is very limited, minimizing user effort while interacting with multimedia content (e.g., searching for a specific video or browsing through images) is the most critical issue. To address this issue, various approaches based on voice communication, gesture control, and gaze tracking have been proposed; however, their performance is still not satisfactory. This paper proposes a new approach for delivering multimedia content to users with minimal effort in the smart TV environment. The proposed approach targets recent smart TVs equipped with a standard mouse as the remote controller and exploits a novel algorithm that infers the user's search intent from mouse movement patterns, thereby improving the speed and accuracy of content search on the smart TV. The experimental results demonstrate the strengths and potential of the proposed approach and suggest directions for further research.
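To make the core idea more concrete, the sketch below shows one simple way such intent inference could work: cursor samples are matched against the bounding boxes of on-screen thumbnails, and each item is scored by hover dwell time and number of hover visits. This is only a minimal illustration under assumed inputs (cursor samples and thumbnail geometry exposed by the TV UI); the features, weights, and names used here are illustrative assumptions, not the algorithm actually proposed in the paper.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

@dataclass
class Thumbnail:
    """Axis-aligned bounding box of a content thumbnail on the TV screen."""
    item_id: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def interest_scores(samples: List[Tuple[float, float, float]],
                    thumbnails: List[Thumbnail]) -> Dict[str, float]:
    """Score each on-screen item from cursor samples of the form (time, x, y).

    Dwell time (seconds the cursor rests over an item) and the number of
    separate hover visits are combined into a simple interest score.
    """
    dwell: Dict[str, float] = {t.item_id: 0.0 for t in thumbnails}
    visits: Dict[str, int] = {t.item_id: 0 for t in thumbnails}
    last_item: Optional[str] = None
    for (t0, _, _), (t1, x1, y1) in zip(samples, samples[1:]):
        hit = next((th for th in thumbnails if th.contains(x1, y1)), None)
        if hit is not None:
            dwell[hit.item_id] += t1 - t0          # credit the elapsed interval
            if hit.item_id != last_item:
                visits[hit.item_id] += 1           # count a new hover visit
        last_item = hit.item_id if hit else None
    # Weight dwell time more heavily than repeat visits (weights are arbitrary).
    return {i: dwell[i] + 0.5 * visits[i] for i in dwell}

if __name__ == "__main__":
    # Example: the cursor lingers over "movie_b", so it receives the highest score.
    grid = [Thumbnail("movie_a", 0, 0, 100, 100),
            Thumbnail("movie_b", 110, 0, 210, 100)]
    trace = [(0.0, 50, 50), (0.2, 120, 40), (0.6, 150, 50), (1.4, 160, 55)]
    print(interest_scores(trace, grid))
```

In practice, a ranking like this could be fed back into the content search loop (e.g., as implicit relevance feedback), which is the general direction the abstract describes.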
