Identification of human implicit visual search intention based on eye movement and pupillary analysis

We propose a novel approach for identifying human implicit visual search intention based on eye movement patterns and pupillary analysis in general, and in particular on pupil size, the gradient of pupil size variation, fixation length, fixation count within areas of interest, and fixation count within non-areas of interest. The proposed model classifies human implicit visual search intention as either task-free visual browsing or task-oriented visual search; task-oriented visual search is further classified as intent generation, intent maintenance, or intent disappearance. During a visual search, measurement of the pupillary response is strongly influenced by external factors such as the intensity and size of the visual stimulus. To mitigate these effects, we propose a robust baseline model that measures the pupillary response accurately. Graphical representation of the measured parameter values shows significant differences among the intent conditions, so these values can serve as features for identification. Using the eye movement patterns and pupillary analysis, a hierarchical support vector machine detects the transitions between implicit intentions: from task-free visual browsing intent to task-oriented visual search intent, and from task-oriented visual search intent maintenance to task-oriented visual search intent disappearance. In the proposed model, the hierarchical support vector machine identifies these transitions with greater than 90% accuracy.
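The two-level classification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic feature vectors, cluster centers, and label routing are assumptions chosen only to show how a level-1 SVM (browsing vs. search) can feed a level-2 SVM (generation / maintenance / disappearance) over the five named features.

```python
# Hedged sketch of a hierarchical SVM over the five eye/pupil features
# named in the abstract. The synthetic data below is illustrative only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Five features per sample: pupil size, gradient of pupil size variation,
# fixation length, fixation count on AOIs, fixation count on non-AOIs.
def make_samples(center, n=40):
    return center + 0.05 * rng.standard_normal((n, 5))

browsing    = make_samples(np.array([0.3,  0.0, 0.2, 0.1, 0.6]))
generation  = make_samples(np.array([0.6,  0.4, 0.5, 0.5, 0.3]))
maintenance = make_samples(np.array([0.7,  0.1, 0.7, 0.8, 0.1]))
disappear   = make_samples(np.array([0.4, -0.4, 0.3, 0.6, 0.2]))

X = np.vstack([browsing, generation, maintenance, disappear])
# Level 1: 0 = task-free browsing, 1 = task-oriented search.
y1 = np.array([0] * 40 + [1] * 120)
# Level 2 (search samples only): generation / maintenance / disappearance.
y2 = np.array([0] * 40 + [1] * 40 + [2] * 40)

level1 = SVC(kernel="rbf").fit(X, y1)
level2 = SVC(kernel="rbf").fit(X[40:], y2)

def classify(x):
    """Route one feature vector through the two-level hierarchy."""
    if level1.predict(x[None])[0] == 0:
        return "task-free browsing"
    sub = level2.predict(x[None])[0]
    return ["intent generation", "intent maintenance",
            "intent disappearance"][sub]

print(classify(browsing[0]))
```

The hierarchy mirrors the paper's taxonomy: the coarse browsing-vs.-search decision is made first, and the finer intent-state decision is made only for samples routed to the search branch.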
