PeepList: Adapting ex-post interaction with pervasive display content using eye tracking

Abstract

Short, intensive interactions with unfamiliar pervasive displays force users to perform cognitive operations under uncertainty, at the risk of not being able to access the relevant information later. We developed a new way of interacting with pervasive displays that harnesses eye tracking to extract the information most likely to be relevant to the user. The extracted items are presented to the user in the PeepList, sorted by their estimated importance. Users can interact with the PeepList without explicit commands, and they can access the customized PeepList ex-post to review information previously consumed from the pervasive display. We carried out a user study with 16 participants to evaluate the contribution of PeepList to efficient pervasive display interaction. The tests revealed that the PeepList system is unobtrusive and accurate, and in particular that it reduces interaction times by 40% on complex tasks. A usable user model can be built in under 30 seconds in 50% of all interactions, and within one minute in a majority (70%) of them. The experimental results show that eye tracking is a valuable real-time implicit source of information about what a user is searching for on a pervasive display, and that it can be used for real-time user-interface adaptation, considerably improving the efficiency of obtaining and retaining the required data.
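
The abstract does not specify how "estimated importance" is computed. Below is a minimal sketch of one plausible ranking scheme, assuming accumulated fixation dwell time per display item as the implicit relevance signal; the item names, the `min_dwell_ms` threshold, and the upstream gaze-to-item mapping are all hypothetical illustrations, not the authors' implementation:

```python
from collections import defaultdict

def build_peeplist(fixations, min_dwell_ms=200):
    """Rank display items by accumulated gaze dwell time.

    `fixations` is an iterable of (item_id, duration_ms) pairs, one per
    fixation that a gaze-to-content mapper has already attributed to a
    display item (that mapping step is assumed, not shown here).
    """
    dwell = defaultdict(float)
    for item_id, duration_ms in fixations:
        dwell[item_id] += duration_ms
    # Keep only items that attracted meaningful attention, then sort
    # descending by total dwell time as a proxy for estimated importance.
    ranked = [(item, t) for item, t in dwell.items() if t >= min_dwell_ms]
    ranked.sort(key=lambda pair: pair[1], reverse=True)
    return [item for item, _ in ranked]

# Example: repeated fixations on a timetable outweigh a glance at an ad.
print(build_peeplist([("bus_42_times", 350), ("ad_banner", 120),
                      ("bus_42_times", 400), ("map_detail", 260)]))
# -> ['bus_42_times', 'map_detail']
```

Such a list could then be refreshed continuously while the user looks at the display, which is consistent with the abstract's claim that a useful user model emerges within 30 to 60 seconds of interaction.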
