An Experimental Study on Visual Search Factors of Information Features in a Task Monitoring Interface

This paper presents an experimental eye-tracking study of different visual search tasks performed on a task monitoring interface, approached from the perspective of psychometrics. Behavioral and physiological response data were collected in two stages: first in a free-viewing scenario with no visual search task, and then in three separate tasks in which subjects were asked to search for enemy information, threat information, and data information, respectively. After dividing the interface into nine task monitoring areas, eye movement indexes were analyzed for each area under each task. The results show that subjects' search paths, reaction times, and eye movements on the task monitoring interface differ significantly across tasks, because the search path is shaped by task-driven cognitive information processing and by the time needed to locate information. Fixation duration, fixation count, and visit count also differ significantly across monitoring areas; information features located in the radar sub-interface are captured especially easily, which is shown to be related to task-driven automatic capture. Information position and features such as color, shape, and size have a significant impact on visual search, since under the different tasks they can readily lead to information omission, misreading, misjudgment, and missed or ignored data. The paper concludes that both the monitoring task and the individual information features of an interface strongly influence visual search, and these findings can guide further research on the design of information features in task monitoring interfaces.
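As a rough illustration of the kind of per-area comparison described above (not the authors' actual analysis pipeline), the Python sketch below runs a one-way ANOVA on fixation durations grouped by area of interest; the simulated data, the nine area labels, and the variable names are all hypothetical.

```python
# Minimal sketch: compare fixation duration across nine areas of interest (AOIs).
# All data and labels here are hypothetical placeholders, not the study's dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated fixation durations (ms) for each of nine monitoring areas,
# e.g. radar sub-interface, data panel, status bar, ...
aoi_labels = [f"AOI_{i}" for i in range(1, 10)]
fixations = {
    label: rng.normal(loc=250 + 15 * i, scale=60, size=30).clip(min=50)
    for i, label in enumerate(aoi_labels)
}

# One-way ANOVA: does mean fixation duration differ across AOIs?
f_stat, p_value = stats.f_oneway(*fixations.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Descriptive summary per AOI
for label, durations in fixations.items():
    print(f"{label}: mean = {durations.mean():.1f} ms, sd = {durations.std(ddof=1):.1f} ms")
```

In a real analysis of this kind, a significant omnibus result would typically be followed by pairwise post-hoc comparisons between areas; the sketch stops at the omnibus test.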
