Detecting Relevance during Decision-Making from Eye Movements for UI Adaptation

This paper proposes an approach to detect information relevance during decision-making from eye movements in order to enable user interface adaptation. This is a challenging task because gaze behavior varies greatly across individual users and tasks, and ground-truth data is difficult to obtain. Prior work has therefore mostly focused on simpler target-search tasks or on establishing general interest, where gaze behavior is less complex. From the literature, we identify six metrics that capture different aspects of gaze behavior during decision-making and combine them in a voting scheme. We empirically show that this combination accounts for the large variation in gaze behavior and outperforms standalone metrics. Importantly, it offers an intuitive way to control the amount of detected information, which is crucial for different UI adaptation schemes to succeed. We demonstrate the applicability of our approach with a room-search application that changes the visual saliency of content detected as relevant. In an empirical study, our approach detects up to 97% of relevant elements with respect to user self-reports, which allows us to meaningfully adapt the interface, as confirmed by participants. Our approach is fast, requires no explicit user input, and can be applied independently of task and user.
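To make the voting scheme concrete, the following is a minimal Python sketch of combining per-element gaze metrics by majority vote. The feature names, the above-the-mean voting rule, and the `min_votes` parameter are illustrative assumptions for exposition; they are not the paper's actual six metrics or thresholds.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class GazeFeatures:
    """Hypothetical per-element gaze features (names are illustrative)."""
    total_fixation_duration: float  # seconds of fixation on the element
    fixation_count: int             # number of fixations on the element
    revisit_count: int              # times gaze returned to the element


# A metric votes "relevant" (True) for an element, here by comparing one
# feature against the mean over all elements in the current view.
MetricVote = Callable[[GazeFeatures, List[GazeFeatures]], bool]


def above_mean(feature: Callable[[GazeFeatures], float]) -> MetricVote:
    """Build a metric that votes when the feature exceeds the view-wide mean."""
    def vote(element: GazeFeatures, all_elements: List[GazeFeatures]) -> bool:
        mean = sum(feature(e) for e in all_elements) / len(all_elements)
        return feature(element) > mean
    return vote


# Three illustrative metrics; the paper's scheme combines six.
METRICS: List[MetricVote] = [
    above_mean(lambda f: f.total_fixation_duration),
    above_mean(lambda f: float(f.fixation_count)),
    above_mean(lambda f: float(f.revisit_count)),
]


def relevant_elements(elements: Dict[str, GazeFeatures],
                      min_votes: int = 2) -> List[str]:
    """Return ids of elements that at least `min_votes` metrics flag as relevant.

    Raising `min_votes` yields fewer, higher-confidence detections; lowering it
    yields more. This is the knob that lets a UI adaptation scheme control how
    much detected information it acts on.
    """
    all_features = list(elements.values())
    return [
        element_id
        for element_id, features in elements.items()
        if sum(metric(features, all_features) for metric in METRICS) >= min_votes
    ]
```

Usage would amount to accumulating `GazeFeatures` per UI element from the eye tracker's fixation stream, calling `relevant_elements`, and then adapting (e.g., highlighting) the returned elements.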
